Apr 16 19:30:21.347349 ip-10-0-133-241 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:30:21.775310 ip-10-0-133-241 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:21.775310 ip-10-0-133-241 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:30:21.775310 ip-10-0-133-241 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:21.775310 ip-10-0-133-241 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:30:21.775310 ip-10-0-133-241 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
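The deprecation warnings above all point at the same remedy: move these flags into the KubeletConfiguration file named by --config (on this node, /etc/kubernetes/kubelet.conf). A minimal sketch of what that file could contain — the field names are real KubeletConfiguration fields, but the values and the volume plugin path are illustrative assumptions, not taken from this node:

```yaml
# Hypothetical KubeletConfiguration covering the deprecated flags above.
# Values are illustrative; actual values come from the machine config.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (assumed path)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (assumed reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

Note that --pod-infra-container-image has no config-file replacement; per the warning it is being removed in favor of the CRI reporting the sandbox image.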
Apr 16 19:30:21.776463 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.776092 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:30:21.781280 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781258 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:21.781280 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781278 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:21.781280 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781283 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781287 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781292 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781296 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781300 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781304 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781310 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781315 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781319 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781323 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781327 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781333 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781338 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781342 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781346 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781350 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781354 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781358 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781362 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:21.781480 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781366 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781370 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781374 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781378 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781383 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781387 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781391 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781395 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781399 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781403 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781407 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781411 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781417 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781421 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781425 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781431 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781435 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781439 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781443 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781448 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781452 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:21.782322 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781456 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781461 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781466 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781470 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781474 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781478 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781482 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781486 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781491 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781495 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781500 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781505 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781509 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781513 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781517 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781521 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781525 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781530 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781535 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:21.783030 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781539 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781543 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781547 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781552 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781556 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781560 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781564 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781569 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781573 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781578 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781583 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781587 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781591 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781596 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781600 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781604 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781608 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781614 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781618 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:21.783640 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781622 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781626 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781630 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781634 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781637 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.781642 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782272 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782281 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782285 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782290 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782294 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782299 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782304 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782308 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782313 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782317 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782320 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782324 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782327 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782331 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:21.784470 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782336 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782340 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782344 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782348 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782352 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782356 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782360 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782364 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782369 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782373 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782379 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782383 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782387 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782391 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782395 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782399 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782403 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782407 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782412 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:21.785133 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782416 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782420 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782425 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782429 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782434 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782438 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782441 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782446 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782453 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782459 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782464 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782469 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782474 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782479 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782483 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782488 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782494 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782499 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782503 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:21.785688 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782508 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782513 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782517 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782522 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782527 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782532 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782537 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782542 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782547 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782551 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782555 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782560 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782564 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782568 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782572 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782576 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782580 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782584 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782588 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782592 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:21.786271 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782596 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782600 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782605 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782609 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782613 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782617 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782622 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782626 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782632 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782639 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782643 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782648 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782652 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.782656 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783383 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783398 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783416 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783424 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783436 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783442 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783449 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:30:21.786907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783455 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783461 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783465 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783471 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783476 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783481 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783486 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783491 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783496 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783501 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783505 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783510 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783515 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783520 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783526 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783530 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783536 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783542 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783547 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783553 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783558 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783563 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783568 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783573 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783579 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:30:21.787483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783584 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783591 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783596 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783600 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783607 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783613 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783618 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783625 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783630 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783634 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783639 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783644 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783650 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783654 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783658 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783663 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783668 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783673 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]:
I0416 19:30:21.783678 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783683 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783687 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783692 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783697 2575 flags.go:64] FLAG: --feature-gates="" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783703 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783708 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:30:21.788167 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783714 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783719 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783724 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783730 2575 flags.go:64] FLAG: --help="false" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783734 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-133-241.ec2.internal" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783739 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783744 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783749 2575 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783754 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783761 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783765 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783772 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783777 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783782 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783787 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783793 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783797 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783802 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783807 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783812 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783817 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:30:21.788838 ip-10-0-133-241 
kubenswrapper[2575]: I0416 19:30:21.783821 2575 flags.go:64] FLAG: --lock-file="" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783826 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783831 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:30:21.788838 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783836 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783845 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783850 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783855 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783860 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783864 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783870 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783874 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783879 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783886 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783891 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783898 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 
19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783903 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783907 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783913 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783917 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783922 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783927 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783932 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783945 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783950 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783955 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783961 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:30:21.789424 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783966 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783975 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 
19:30:21.783980 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783985 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783990 2575 flags.go:64] FLAG: --port="10250" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.783995 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784000 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-051670737500e8ca4" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784008 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784013 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784018 2575 flags.go:64] FLAG: --register-node="true" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784023 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784027 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784033 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784038 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784042 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784064 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784072 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784077 2575 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784082 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784087 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784091 2575 flags.go:64] FLAG: --runonce="false" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784096 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784101 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784106 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784111 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784116 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:30:21.790022 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784121 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784126 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784131 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784137 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784142 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784146 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784153 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784159 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784164 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784169 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784177 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784182 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784187 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784198 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784203 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784207 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784212 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784218 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784222 2575 flags.go:64] FLAG: --v="2" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784229 2575 flags.go:64] FLAG: --version="false" Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784235 2575 flags.go:64] FLAG: --vmodule="" 
Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784242    2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.784248    2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784391    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784397    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:21.790721 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784402    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784407    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784412    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784416    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784420    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784425    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784430    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784434    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784438    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784443    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784451    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784455    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784459    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784464    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784469    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784473    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784477    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784481    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784485    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784491    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:21.791343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784498    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784502    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784506    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784510    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784514    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784519    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784523    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784527    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784531    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784535    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784539    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784543    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784548    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784552    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784556    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784560    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784564    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784568    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784572    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784576    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:21.791842 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784580    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784584    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784589    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784593    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784597    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784601    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784605    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784610    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784614    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784619    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784623    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784627    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784632    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784636    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784643    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784649    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784661    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784666    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784670    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:21.792344 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784674    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784678    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784682    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784686    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784691    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784695    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784699    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784703    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784707    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784711    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784716    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784719    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784724    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784728    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784732    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784736    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784741    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784745    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784750    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784754    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:21.792816 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784758    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784772    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784777    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784782    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.784786    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.785474    2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.792259    2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.792274    2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792326    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792332    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792336    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792340    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792344    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792349    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792352    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792355    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:21.793357 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792358    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792361    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792363    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792366    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792369    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792372    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792374    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792377    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792380    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792382    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792386    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792388    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792391    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792393    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792396    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792399    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792403    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792405    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792408    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:21.793768 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792410    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:21.794253
ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792413 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792416 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792418 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792422 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792425 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792428 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792430 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792433 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792435 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792438 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792440 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792443 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792446 2575 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792448 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792451 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792453 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792455 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792458 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792461 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:30:21.794253 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792463 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792466 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792468 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792471 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792474 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792476 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 
19:30:21.792479 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792481 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792484 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792486 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792489 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792491 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792494 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792496 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792498 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792503 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792507 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792511 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792513 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:21.794735 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792516 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792519 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792522 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792524 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792527 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792530 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792532 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792535 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792538 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792540 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792543 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792546 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792548 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792550 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792553 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792555 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792558 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792561 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792563 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:21.795228 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792565 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.792570 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792674 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792680 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792683 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792686 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792689 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792692 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792694 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792697 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792700 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792703 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792712 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792714 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792717 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792720 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:21.795700 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792722 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792726 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792730 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792733 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792736 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792738 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792741 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792743 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792746 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792748 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792751 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792753 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792756 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792758 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792761 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792763 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792766 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792768 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792771 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:21.796112 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792773 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792775 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792778 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792781 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792783 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792786 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792788 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792791 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792794 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792796 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792804 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792807 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792809 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792811 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792814 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792816 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792819 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792822 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792825 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792827 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:21.796576 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792830 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792832 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792835 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792838 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792840 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792842 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792845 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792847 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792850 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792853 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792855 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792858 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792860 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792862 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792865 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792867 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792870 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792872 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792875 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:21.797082 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792877 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792880 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792882 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792885 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792894 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792896 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792899 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792901 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792905 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792908 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792911 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792914 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792917 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:21.792919 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.792924 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:21.797539 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.793600 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:30:21.797892 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.796312 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:30:21.797892 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.797376 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 19:30:21.797892 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.797466 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:21.797892 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.797504 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:21.821082 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.821045 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:21.823370 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.823354 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:21.837117 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.837097 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:30:21.842579 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.842565 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 19:30:21.843840 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.843820 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:30:21.847266 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.847247 2575 fs.go:135] Filesystem UUIDs: map[53d79f10-763e-45f0-96e9-9953449ebbe4:/dev/nvme0n1p3 789c2fa0-ba3a-4a07-8b1c-8855514952ed:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 19:30:21.847325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.847266 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:30:21.852647 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.852531 2575 manager.go:217] Machine: {Timestamp:2026-04-16 19:30:21.850806313 +0000 UTC m=+0.385820752 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3072062 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec251a2b10dc964b6eb7d1f94f8a17a6 SystemUUID:ec251a2b-10dc-964b-6eb7-d1f94f8a17a6 BootID:3848edc0-e62d-40e0-8ce4-69b33e2062c5 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0f:5e:d1:57:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0f:5e:d1:57:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:a9:d2:a3:26:15 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:30:21.852647 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.852641 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:30:21.852766 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.852724 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:30:21.852842 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.852827 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:30:21.854244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.854224 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:30:21.854368 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.854246 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-241.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:30:21.854417 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.854378 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:30:21.854417 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.854387 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:30:21.854417 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.854400
2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:21.855956 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.855942 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:21.857260 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.857249 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:21.857361 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.857351 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:30:21.859635 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.859624 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:30:21.859670 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.859645 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:30:21.859670 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.859661 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:30:21.859670 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.859669 2575 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:30:21.859790 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.859678 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:30:21.860839 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.860825 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:21.860876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.860852 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:21.863504 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.863487 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:30:21.864880 ip-10-0-133-241 
kubenswrapper[2575]: I0416 19:30:21.864860 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:30:21.866647 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866635 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866653 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866659 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866664 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866670 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866676 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866681 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866686 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866693 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:30:21.866704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866699 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:30:21.866929 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866722 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
19:30:21.866929 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.866733 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:30:21.867490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.867481 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:30:21.867490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.867491 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:30:21.870338 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.870322 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-241.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:30:21.870521 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.870503 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:30:21.870556 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.870506 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-241.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:30:21.870957 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.870945 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:30:21.870988 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.870982 2575 server.go:1295] "Started kubelet" Apr 16 19:30:21.871088 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.871063 2575 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 19:30:21.871198 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.871080 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:30:21.871198 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.871169 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:30:21.871813 ip-10-0-133-241 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:30:21.872162 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.872146 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:30:21.873032 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.873018 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:30:21.873731 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.873713 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vhwkd" Apr 16 19:30:21.878919 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.878901 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:30:21.878919 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.878917 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:30:21.879866 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.879831 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:21.879866 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.879837 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:30:21.880444 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.880398 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:30:21.880575 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.880564 2575 volume_manager.go:297] "Starting 
Kubelet Volume Manager" Apr 16 19:30:21.880726 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.880704 2575 factory.go:55] Registering systemd factory Apr 16 19:30:21.880798 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.880737 2575 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:30:21.881300 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.881273 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:30:21.882205 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882184 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:30:21.882205 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882204 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:30:21.882442 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882427 2575 factory.go:153] Registering CRI-O factory Apr 16 19:30:21.882502 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882447 2575 factory.go:223] Registration of the crio container factory successfully Apr 16 19:30:21.882502 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882499 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:30:21.882594 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882524 2575 factory.go:103] Registering Raw factory Apr 16 19:30:21.882594 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882539 2575 manager.go:1196] Started watching for new ooms in manager Apr 16 19:30:21.882947 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.882924 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vhwkd" Apr 16 19:30:21.883144 ip-10-0-133-241 
kubenswrapper[2575]: I0416 19:30:21.883128 2575 manager.go:319] Starting recovery of all containers Apr 16 19:30:21.884735 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.884706 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:30:21.884830 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.884792 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-241.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:30:21.887790 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.884770 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-241.ec2.internal.18a6ed1f550d894c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-241.ec2.internal,UID:ip-10-0-133-241.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-241.ec2.internal,},FirstTimestamp:2026-04-16 19:30:21.870958924 +0000 UTC m=+0.405973363,LastTimestamp:2026-04-16 19:30:21.870958924 +0000 UTC m=+0.405973363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-241.ec2.internal,}" Apr 16 19:30:21.896535 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.896416 2575 manager.go:324] Recovery 
completed Apr 16 19:30:21.900488 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.900475 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:21.902829 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.902811 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:21.902907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.902848 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:21.902907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.902878 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:21.903411 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.903392 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:30:21.903411 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.903406 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:30:21.903557 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.903422 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:21.905831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.905817 2575 policy_none.go:49] "None policy: Start" Apr 16 19:30:21.905831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.905833 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:30:21.905924 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.905843 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:30:21.943586 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.943572 2575 manager.go:341] "Starting Device Plugin manager" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.943600 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not 
found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.943609 2575 server.go:85] "Starting device plugin registration server" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.943899 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.943911 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.943998 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.944083 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:21.944091 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.944675 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:30:21.950237 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:21.944715 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.012802 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.012767 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:30:22.013959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.013940 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 19:30:22.014040 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.013973 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:30:22.014040 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.013993 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 19:30:22.014040 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.014004 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:30:22.014198 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.014042 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:30:22.017829 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.017813 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:22.044450 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.044415 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:22.045164 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.045151 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:22.045239 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.045177 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:22.045239 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.045188 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:22.045239 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.045208 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.053777 ip-10-0-133-241 kubenswrapper[2575]: I0416 
19:30:22.053758 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.053777 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.053777 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-241.ec2.internal\": node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.072016 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.071967 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.114304 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.114267 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal"] Apr 16 19:30:22.114390 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.114339 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:22.115105 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.115089 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:22.115186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.115118 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:22.115186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.115133 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:22.116496 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.116482 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:22.116640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.116627 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.116700 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.116653 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:22.117172 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117156 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:22.117238 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117183 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:22.117238 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117197 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:22.117911 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117897 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:22.117984 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117925 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:22.117984 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.117939 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:22.118384 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.118370 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.118437 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.118393 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:22.118965 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.118952 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:22.119032 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.118975 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:22.119032 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.118984 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:22.142566 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.142544 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-241.ec2.internal\" not found" node="ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.145774 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.145758 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-241.ec2.internal\" not found" node="ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.172056 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.172037 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.183893 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.183874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.183974 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.183903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.183974 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.183928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8012d096d8d919883fff307930899c9a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-241.ec2.internal\" (UID: \"8012d096d8d919883fff307930899c9a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.272282 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.272262 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.284699 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.284762 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.284762 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8012d096d8d919883fff307930899c9a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-241.ec2.internal\" (UID: \"8012d096d8d919883fff307930899c9a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.284831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8012d096d8d919883fff307930899c9a-config\") pod \"kube-apiserver-proxy-ip-10-0-133-241.ec2.internal\" (UID: \"8012d096d8d919883fff307930899c9a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.284831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.284831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.284786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/76b29a18b5fb21662a973de3f8f2bd10-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal\" (UID: \"76b29a18b5fb21662a973de3f8f2bd10\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.373108 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.373064 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.444623 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.444601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.449074 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.449042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:22.473742 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.473710 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.574219 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.574195 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.674784 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.674734 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.768565 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.768546 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:22.775453 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.775434 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.796960 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.796941 2575 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 16 19:30:22.797330 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.797035 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:30:22.797330 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.797089 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:30:22.876118 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.876095 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:22.879187 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.879165 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:30:22.888304 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.888277 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:25:21 +0000 UTC" deadline="2028-01-31 18:08:58.45843793 +0000 UTC" Apr 16 19:30:22.888304 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.888302 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15718h38m35.570139302s" Apr 16 19:30:22.890719 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.890702 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:30:22.913357 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.913338 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-68m4b" Apr 16 19:30:22.923989 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.923967 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-68m4b" Apr 16 19:30:22.941343 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:22.941320 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b29a18b5fb21662a973de3f8f2bd10.slice/crio-63e8d38084c067953c32d18cf1fc13c37abe189f172b74611bcea67cc5cab834 WatchSource:0}: Error finding container 63e8d38084c067953c32d18cf1fc13c37abe189f172b74611bcea67cc5cab834: Status 404 returned error can't find the container with id 63e8d38084c067953c32d18cf1fc13c37abe189f172b74611bcea67cc5cab834 Apr 16 19:30:22.941804 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:22.941785 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8012d096d8d919883fff307930899c9a.slice/crio-906596282b01b9a6194120b1475b5d880b3d545f8a965f22d905d753c635177a WatchSource:0}: Error finding container 906596282b01b9a6194120b1475b5d880b3d545f8a965f22d905d753c635177a: Status 404 returned error can't find the container with id 906596282b01b9a6194120b1475b5d880b3d545f8a965f22d905d753c635177a Apr 16 19:30:22.945848 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:22.945834 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:30:22.976179 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:22.976157 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:23.016246 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.016206 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" event={"ID":"8012d096d8d919883fff307930899c9a","Type":"ContainerStarted","Data":"906596282b01b9a6194120b1475b5d880b3d545f8a965f22d905d753c635177a"} Apr 16 19:30:23.017213 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.017190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" event={"ID":"76b29a18b5fb21662a973de3f8f2bd10","Type":"ContainerStarted","Data":"63e8d38084c067953c32d18cf1fc13c37abe189f172b74611bcea67cc5cab834"} Apr 16 19:30:23.076358 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:23.076338 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-241.ec2.internal\" not found" Apr 16 19:30:23.133303 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.133283 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:23.179179 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.179160 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" Apr 16 19:30:23.190317 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.190270 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:30:23.192205 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.192191 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" Apr 16 19:30:23.200754 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.200740 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 
16 19:30:23.339844 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.339800 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:23.860730 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.860700 2575 apiserver.go:52] "Watching apiserver" Apr 16 19:30:23.869007 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.868791 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:30:23.871163 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.871137 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vwbhq","kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh","openshift-image-registry/node-ca-5f289","openshift-network-diagnostics/network-check-target-brgng","openshift-ovn-kubernetes/ovnkube-node-gxnqw","openshift-cluster-node-tuning-operator/tuned-ckws9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal","openshift-multus/multus-7vjxb","openshift-multus/multus-additional-cni-plugins-sslrf","openshift-multus/network-metrics-daemon-lxtld","openshift-network-operator/iptables-alerter-srwtb"] Apr 16 19:30:23.872666 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.872640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:23.873804 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.873775 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.874971 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.874950 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:23.875159 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.875140 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:30:23.875495 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.875476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mpp7w\"" Apr 16 19:30:23.875495 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.875489 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:30:23.876268 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.876237 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:30:23.876455 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.876436 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.876531 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.876481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-67p2p\"" Apr 16 19:30:23.876681 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.876663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.877529 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.877368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.877529 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.877383 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.877693 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.877536 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:30:23.877693 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.877540 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kztg\"" Apr 16 19:30:23.878285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.878257 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:23.878376 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:23.878330 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:23.878465 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.878386 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.879889 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.879820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.880538 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.880522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.880629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.880517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:30:23.881289 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.881272 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.881905 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.881887 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.882027 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkf26\"" Apr 16 19:30:23.882112 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:30:23.882190 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882176 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:30:23.882291 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:30:23.882409 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882394 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.882521 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2zhxj\"" Apr 16 19:30:23.882580 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.882974 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.882955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:23.883457 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.883437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.883551 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.883503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:30:23.883723 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.883705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.884184 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.884166 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fqv6z\"" Apr 16 19:30:23.884271 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.884169 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:30:23.884589 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.884568 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:23.884675 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:23.884648 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:23.885278 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.885241 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:30:23.885373 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.885355 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:30:23.885448 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.885416 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m7qsg\"" Apr 16 19:30:23.886090 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.886068 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:23.888975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.888515 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:30:23.888975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.888545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:30:23.888975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.888602 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8zsmq\"" Apr 16 19:30:23.888975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.888517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:30:23.892631 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-system-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.892752 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-systemd-units\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.892752 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-netd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.892752 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-cni-binary-copy\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.892752 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-netns\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.892918 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-multus-daemon-config\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.892918 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/be225946-41f7-4fe4-8421-be88c9efe965-agent-certs\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:23.892918 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892872 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-k8s-cni-cncf-io\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893063 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893063 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-multus-certs\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893063 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.892977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-log-socket\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893063 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-lib-modules\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 
19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-hostroot\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vglvt\" (UniqueName: \"kubernetes.io/projected/16c561dd-93b3-4b83-9374-3a46663b8962-kube-api-access-vglvt\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-device-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-cnibin\") pod \"multus-7vjxb\" (UID: 
\"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893220 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-var-lib-kubelet\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-kubelet\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-netns\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-etc-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-kubelet\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-config\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16c561dd-93b3-4b83-9374-3a46663b8962-ovn-node-metrics-cert\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893402 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-env-overrides\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysconfig\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-sys\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-sys-fs\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76a993d-4a08-49b0-a7f9-dc97575009ad-host\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-node-log\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-tuned\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893628 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-tmp\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-os-release\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-etc-kubernetes\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rzt\" (UniqueName: \"kubernetes.io/projected/3bbc206b-84a4-45f8-9836-82284b580174-kube-api-access-25rzt\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-bin\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-systemd\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-run\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d45l\" (UniqueName: \"kubernetes.io/projected/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kube-api-access-4d45l\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-script-lib\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-registration-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e76a993d-4a08-49b0-a7f9-dc97575009ad-serviceca\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289"
Apr 16 19:30:23.893941 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-host\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-socket-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.893988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-ovn\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-modprobe-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/be225946-41f7-4fe4-8421-be88c9efe965-konnectivity-ca\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894096 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-systemd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-var-lib-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-bin\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-conf-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-kubernetes\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8k2\" (UniqueName: \"kubernetes.io/projected/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-kube-api-access-vn8k2\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-multus\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-conf\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.894640 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mz82\" (UniqueName: \"kubernetes.io/projected/e76a993d-4a08-49b0-a7f9-dc97575009ad-kube-api-access-8mz82\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289"
Apr 16 19:30:23.895285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-socket-dir-parent\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.895285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.894376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-slash\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.924554 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.924530 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:25:22 +0000 UTC" deadline="2027-09-29 09:54:32.891052893 +0000 UTC"
Apr 16 19:30:23.924554 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.924552 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12734h24m8.966503963s"
Apr 16 19:30:23.980671 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.980643 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:30:23.995175 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-k8s-cni-cncf-io\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-os-release\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-multus-certs\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-log-socket\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-lib-modules\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.995296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-k8s-cni-cncf-io\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-hostroot\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-log-socket\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vglvt\" (UniqueName: \"kubernetes.io/projected/16c561dd-93b3-4b83-9374-3a46663b8962-kube-api-access-vglvt\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-lib-modules\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-hostroot\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-device-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-cnibin\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.995545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-var-lib-kubelet\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-kubelet\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-multus-certs\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-cnibin\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-netns\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-device-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-kubelet\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-netns\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-etc-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-etc-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-var-lib-kubelet\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-kubelet\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-kubelet\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-config\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16c561dd-93b3-4b83-9374-3a46663b8962-ovn-node-metrics-cert\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-env-overrides\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysconfig\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-sys\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-sys-fs\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76a993d-4a08-49b0-a7f9-dc97575009ad-host\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysconfig\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.995998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwsr\" (UniqueName: \"kubernetes.io/projected/0c96c659-d972-4967-bf3d-e50d4088b9e5-kube-api-access-xrwsr\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-node-log\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e76a993d-4a08-49b0-a7f9-dc97575009ad-host\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-tuned\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-tmp\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-node-log\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-sys-fs\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-system-cni-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-sys\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh"
Apr 16 19:30:23.996959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-os-release\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-etc-kubernetes\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rzt\" (UniqueName: \"kubernetes.io/projected/3bbc206b-84a4-45f8-9836-82284b580174-kube-api-access-25rzt\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-env-overrides\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-bin\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-config\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-systemd\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9"
Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996454 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled.
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-run\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d45l\" (UniqueName: \"kubernetes.io/projected/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kube-api-access-4d45l\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677qw\" (UniqueName: \"kubernetes.io/projected/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-kube-api-access-677qw\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-script-lib\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-registration-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-run\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e76a993d-4a08-49b0-a7f9-dc97575009ad-serviceca\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-binary-copy\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-etc-kubernetes\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:23.997832 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.996477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-os-release\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.997787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-bin\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.997885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e76a993d-4a08-49b0-a7f9-dc97575009ad-serviceca\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.997880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-systemd\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.997918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-host\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.997982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-registration-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16c561dd-93b3-4b83-9374-3a46663b8962-ovnkube-script-lib\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.998675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-socket-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.999013 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-ovn\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999013 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998897 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999013 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-modprobe-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.999013 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.998988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/be225946-41f7-4fe4-8421-be88c9efe965-konnectivity-ca\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:23.999223 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.999223 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-cnibin\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:23.999223 ip-10-0-133-241 kubenswrapper[2575]: 
I0416 19:30:23.999173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-systemd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999374 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-var-lib-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999374 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-bin\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.999472 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-var-lib-openvswitch\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999538 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-conf-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.999538 ip-10-0-133-241 kubenswrapper[2575]: 
I0416 19:30:23.999526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-kubernetes\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.999633 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:23.999689 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8k2\" (UniqueName: \"kubernetes.io/projected/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-kube-api-access-vn8k2\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.999746 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-multus\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.999746 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" 
(UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999837 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:23.999837 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-conf\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:23.999837 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-bin\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:23.999976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mz82\" (UniqueName: \"kubernetes.io/projected/e76a993d-4a08-49b0-a7f9-dc97575009ad-kube-api-access-8mz82\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:23.999976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:23.999976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-host-slash\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:23.999976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-socket-dir-parent\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.000201 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:23.999990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-slash\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.000253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.000209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-systemd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.000299 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.000259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-conf-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.000346 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.000311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-kubernetes\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.000414 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.000396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-modprobe-d\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.000960 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.000931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/be225946-41f7-4fe4-8421-be88c9efe965-konnectivity-ca\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:24.001041 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.001000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-run-ovn\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.001106 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.001066 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-run-ovn-kubernetes\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.002032 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.001819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-socket-dir\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:24.002032 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.001999 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:24.002032 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.002022 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:24.002195 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.002035 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:24.002195 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.002132 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:24.502101304 +0000 UTC m=+3.037115749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:24.003722 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.002001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-multus-socket-dir-parent\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.004760 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.002483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-sysctl-conf\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.004760 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-var-lib-cni-multus\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.004949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-system-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.004949 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-systemd-units\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.004949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-netd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.005182 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9pw\" (UniqueName: \"kubernetes.io/projected/046d5342-ca0b-4fe3-b388-6fa9f477de08-kube-api-access-dr9pw\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:24.005182 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.004986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-iptables-alerter-script\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.005182 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-cni-binary-copy\") pod \"multus-7vjxb\" (UID: 
\"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.005405 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-netns\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.005405 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-multus-daemon-config\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.005405 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/be225946-41f7-4fe4-8421-be88c9efe965-agent-certs\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:24.005405 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.005723 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vglvt\" (UniqueName: \"kubernetes.io/projected/16c561dd-93b3-4b83-9374-3a46663b8962-kube-api-access-vglvt\") pod 
\"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.005723 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-host-run-netns\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.005723 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-slash\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.005723 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3bbc206b-84a4-45f8-9836-82284b580174-system-cni-dir\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.005975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-systemd-units\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.005975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16c561dd-93b3-4b83-9374-3a46663b8962-host-cni-netd\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.005975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.005889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-cni-binary-copy\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.006256 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.006065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3bbc206b-84a4-45f8-9836-82284b580174-multus-daemon-config\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.006342 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.006275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-host\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.006616 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.006597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-tmp\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.007213 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.006986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16c561dd-93b3-4b83-9374-3a46663b8962-ovn-node-metrics-cert\") pod \"ovnkube-node-gxnqw\" (UID: \"16c561dd-93b3-4b83-9374-3a46663b8962\") " pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.008027 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.007427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d45l\" (UniqueName: \"kubernetes.io/projected/b747b1c6-2c2b-4967-8dba-6539fe9fc5d2-kube-api-access-4d45l\") pod \"aws-ebs-csi-driver-node-v6bjh\" (UID: \"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:24.008027 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.007475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rzt\" (UniqueName: \"kubernetes.io/projected/3bbc206b-84a4-45f8-9836-82284b580174-kube-api-access-25rzt\") pod \"multus-7vjxb\" (UID: \"3bbc206b-84a4-45f8-9836-82284b580174\") " pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.008637 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.008580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-etc-tuned\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.016788 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.010552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/be225946-41f7-4fe4-8421-be88c9efe965-agent-certs\") pod \"konnectivity-agent-vwbhq\" (UID: \"be225946-41f7-4fe4-8421-be88c9efe965\") " pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:24.016788 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.010554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mz82\" (UniqueName: \"kubernetes.io/projected/e76a993d-4a08-49b0-a7f9-dc97575009ad-kube-api-access-8mz82\") pod \"node-ca-5f289\" (UID: \"e76a993d-4a08-49b0-a7f9-dc97575009ad\") " pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:24.018213 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.018193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8k2\" (UniqueName: \"kubernetes.io/projected/8afc67d1-0264-4436-8d0e-ebcfa3237e7f-kube-api-access-vn8k2\") pod \"tuned-ckws9\" (UID: \"8afc67d1-0264-4436-8d0e-ebcfa3237e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.060931 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.060905 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:24.106602 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9pw\" (UniqueName: \"kubernetes.io/projected/046d5342-ca0b-4fe3-b388-6fa9f477de08-kube-api-access-dr9pw\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:24.106720 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-iptables-alerter-script\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.106720 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106720 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106666 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-os-release\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106720 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwsr\" (UniqueName: \"kubernetes.io/projected/0c96c659-d972-4967-bf3d-e50d4088b9e5-kube-api-access-xrwsr\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-os-release\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-system-cni-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " 
pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-677qw\" (UniqueName: \"kubernetes.io/projected/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-kube-api-access-677qw\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.106834 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-binary-copy\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.106901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " 
pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-cnibin\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.106979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.107003 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:24.606988667 +0000 UTC m=+3.142003093 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-host-slash\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-host-slash\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.107195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-cnibin\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107477 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c96c659-d972-4967-bf3d-e50d4088b9e5-system-cni-dir\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107477 ip-10-0-133-241 kubenswrapper[2575]: I0416 
19:30:24.107263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-iptables-alerter-script\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.107477 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-binary-copy\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.107642 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.107557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.108222 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.108202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c96c659-d972-4967-bf3d-e50d4088b9e5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sslrf\" (UID: \"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.116830 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.116777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwsr\" (UniqueName: \"kubernetes.io/projected/0c96c659-d972-4967-bf3d-e50d4088b9e5-kube-api-access-xrwsr\") pod \"multus-additional-cni-plugins-sslrf\" (UID: 
\"0c96c659-d972-4967-bf3d-e50d4088b9e5\") " pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.116920 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.116845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-677qw\" (UniqueName: \"kubernetes.io/projected/505f35a8-3d21-4bfe-89d5-dd9c95882f3c-kube-api-access-677qw\") pod \"iptables-alerter-srwtb\" (UID: \"505f35a8-3d21-4bfe-89d5-dd9c95882f3c\") " pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.117223 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.117195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9pw\" (UniqueName: \"kubernetes.io/projected/046d5342-ca0b-4fe3-b388-6fa9f477de08-kube-api-access-dr9pw\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:24.184166 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.184143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:24.192917 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.192895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" Apr 16 19:30:24.201297 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.201280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5f289" Apr 16 19:30:24.206806 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.206790 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:24.213338 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.213323 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ckws9" Apr 16 19:30:24.221386 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.221368 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7vjxb" Apr 16 19:30:24.226913 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.226898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sslrf" Apr 16 19:30:24.232448 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.232431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-srwtb" Apr 16 19:30:24.491374 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.491310 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod505f35a8_3d21_4bfe_89d5_dd9c95882f3c.slice/crio-fefc3c262e360663975ce4d74e5e091ae0cab9524b70526300391442de542c0a WatchSource:0}: Error finding container fefc3c262e360663975ce4d74e5e091ae0cab9524b70526300391442de542c0a: Status 404 returned error can't find the container with id fefc3c262e360663975ce4d74e5e091ae0cab9524b70526300391442de542c0a Apr 16 19:30:24.492680 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.492638 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bbc206b_84a4_45f8_9836_82284b580174.slice/crio-4fde79c76c992c3d95349a8a424732e5501b81ab786668a52b467db0467f2e64 WatchSource:0}: Error finding container 4fde79c76c992c3d95349a8a424732e5501b81ab786668a52b467db0467f2e64: Status 404 returned error can't find the container with id 4fde79c76c992c3d95349a8a424732e5501b81ab786668a52b467db0467f2e64 Apr 16 19:30:24.493910 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.493887 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe225946_41f7_4fe4_8421_be88c9efe965.slice/crio-81258c9449472ff8838f95da432f814cfb37ff5fb6656348ad71f821ce3f29c9 WatchSource:0}: Error finding container 81258c9449472ff8838f95da432f814cfb37ff5fb6656348ad71f821ce3f29c9: Status 404 returned error can't find the container with id 81258c9449472ff8838f95da432f814cfb37ff5fb6656348ad71f821ce3f29c9 Apr 16 19:30:24.495756 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.495714 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb747b1c6_2c2b_4967_8dba_6539fe9fc5d2.slice/crio-399c8b36b52907c2a51c37b537fd8c74abfd53a1631f1f15a4a1bb012cc81f44 WatchSource:0}: Error finding container 399c8b36b52907c2a51c37b537fd8c74abfd53a1631f1f15a4a1bb012cc81f44: Status 404 returned error can't find the container with id 399c8b36b52907c2a51c37b537fd8c74abfd53a1631f1f15a4a1bb012cc81f44 Apr 16 19:30:24.496559 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.496528 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c561dd_93b3_4b83_9374_3a46663b8962.slice/crio-f485dfea829e7703e6305462f633c0ce1466aac23ee3fdf4711f7eecc2dff3dd WatchSource:0}: Error finding container f485dfea829e7703e6305462f633c0ce1466aac23ee3fdf4711f7eecc2dff3dd: Status 404 returned error can't find the container with id f485dfea829e7703e6305462f633c0ce1466aac23ee3fdf4711f7eecc2dff3dd Apr 16 19:30:24.497521 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.497495 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c96c659_d972_4967_bf3d_e50d4088b9e5.slice/crio-9a8796037653b7069617da7eb735f87094d2835ec116462e08591ebd73f5034e WatchSource:0}: Error finding container 9a8796037653b7069617da7eb735f87094d2835ec116462e08591ebd73f5034e: Status 404 returned error can't find 
the container with id 9a8796037653b7069617da7eb735f87094d2835ec116462e08591ebd73f5034e Apr 16 19:30:24.498347 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.498226 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afc67d1_0264_4436_8d0e_ebcfa3237e7f.slice/crio-e9aa9b12b4b4a97e705bbb874512ec488622f384288cb270a0885a453f17a40d WatchSource:0}: Error finding container e9aa9b12b4b4a97e705bbb874512ec488622f384288cb270a0885a453f17a40d: Status 404 returned error can't find the container with id e9aa9b12b4b4a97e705bbb874512ec488622f384288cb270a0885a453f17a40d Apr 16 19:30:24.499554 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:24.499529 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76a993d_4a08_49b0_a7f9_dc97575009ad.slice/crio-c78286190189193ef3d5444125472190fedb8ea367c8093d2f3222e2f99f6d41 WatchSource:0}: Error finding container c78286190189193ef3d5444125472190fedb8ea367c8093d2f3222e2f99f6d41: Status 404 returned error can't find the container with id c78286190189193ef3d5444125472190fedb8ea367c8093d2f3222e2f99f6d41 Apr 16 19:30:24.509389 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.509132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:24.509389 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.509266 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:24.509389 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.509282 2575 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:24.509389 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.509294 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:24.509389 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.509362 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.5093441 +0000 UTC m=+4.044358543 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:24.609820 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.609704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:24.609924 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.609852 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 
19:30:24.609924 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:24.609920 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.609897115 +0000 UTC m=+4.144911555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:24.925360 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.925211 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:25:22 +0000 UTC" deadline="2027-11-13 07:34:20.883419847 +0000 UTC" Apr 16 19:30:24.925360 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:24.925244 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13812h3m55.958179102s" Apr 16 19:30:25.015098 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.014986 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:25.015698 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.015346 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:25.025689 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.025628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f289" event={"ID":"e76a993d-4a08-49b0-a7f9-dc97575009ad","Type":"ContainerStarted","Data":"c78286190189193ef3d5444125472190fedb8ea367c8093d2f3222e2f99f6d41"} Apr 16 19:30:25.029245 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.029201 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerStarted","Data":"9a8796037653b7069617da7eb735f87094d2835ec116462e08591ebd73f5034e"} Apr 16 19:30:25.033728 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.033688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwbhq" event={"ID":"be225946-41f7-4fe4-8421-be88c9efe965","Type":"ContainerStarted","Data":"81258c9449472ff8838f95da432f814cfb37ff5fb6656348ad71f821ce3f29c9"} Apr 16 19:30:25.040063 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.039998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7vjxb" event={"ID":"3bbc206b-84a4-45f8-9836-82284b580174","Type":"ContainerStarted","Data":"4fde79c76c992c3d95349a8a424732e5501b81ab786668a52b467db0467f2e64"} Apr 16 19:30:25.050657 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.050630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" event={"ID":"8012d096d8d919883fff307930899c9a","Type":"ContainerStarted","Data":"80437729a3fc3429de7d04cb9e34219dff7ad0f0929ffac82f180fb1f3fbe602"} Apr 16 19:30:25.060593 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.059689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ckws9" 
event={"ID":"8afc67d1-0264-4436-8d0e-ebcfa3237e7f","Type":"ContainerStarted","Data":"e9aa9b12b4b4a97e705bbb874512ec488622f384288cb270a0885a453f17a40d"}
Apr 16 19:30:25.061267 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.061160 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"f485dfea829e7703e6305462f633c0ce1466aac23ee3fdf4711f7eecc2dff3dd"}
Apr 16 19:30:25.069395 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.069367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" event={"ID":"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2","Type":"ContainerStarted","Data":"399c8b36b52907c2a51c37b537fd8c74abfd53a1631f1f15a4a1bb012cc81f44"}
Apr 16 19:30:25.073770 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.073731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-srwtb" event={"ID":"505f35a8-3d21-4bfe-89d5-dd9c95882f3c","Type":"ContainerStarted","Data":"fefc3c262e360663975ce4d74e5e091ae0cab9524b70526300391442de542c0a"}
Apr 16 19:30:25.404862 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.404224 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:25.516999 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.516322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:25.516999 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.516489 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:25.516999 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.516510 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:25.516999 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.516523 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:25.516999 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.516581 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:27.516563953 +0000 UTC m=+6.051578381 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:25.617172 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:25.617133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:25.617345 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.617285 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:25.617400 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:25.617349 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:27.617330691 +0000 UTC m=+6.152345130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:26.016914 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:26.016836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:26.017340 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:26.016970 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:26.093161 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:26.093123 2575 generic.go:358] "Generic (PLEG): container finished" podID="76b29a18b5fb21662a973de3f8f2bd10" containerID="455e27106860493e129e94d07ba07e536f55ec2f88f16f733a9f2744b25f2c73" exitCode=0
Apr 16 19:30:26.093327 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:26.093235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" event={"ID":"76b29a18b5fb21662a973de3f8f2bd10","Type":"ContainerDied","Data":"455e27106860493e129e94d07ba07e536f55ec2f88f16f733a9f2744b25f2c73"}
Apr 16 19:30:26.107440 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:26.107390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-241.ec2.internal" podStartSLOduration=3.107373733 podStartE2EDuration="3.107373733s" podCreationTimestamp="2026-04-16 19:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:25.06556493 +0000 UTC m=+3.600579380" watchObservedRunningTime="2026-04-16 19:30:26.107373733 +0000 UTC m=+4.642388177"
Apr 16 19:30:27.014913 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:27.014883 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:27.015114 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.015004 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:27.098951 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:27.098914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" event={"ID":"76b29a18b5fb21662a973de3f8f2bd10","Type":"ContainerStarted","Data":"b7475ca33ea928d051f1f1b86692dd16ec31560e315ddcfac618f4f809e12fe0"}
Apr 16 19:30:27.112831 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:27.112785 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-241.ec2.internal" podStartSLOduration=4.112766851 podStartE2EDuration="4.112766851s" podCreationTimestamp="2026-04-16 19:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:27.112405751 +0000 UTC m=+5.647420200" watchObservedRunningTime="2026-04-16 19:30:27.112766851 +0000 UTC m=+5.647781301"
Apr 16 19:30:27.533185 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:27.533066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:27.533372 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.533191 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:27.533372 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.533226 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:27.533372 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.533239 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:27.533372 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.533306 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:31.533287096 +0000 UTC m=+10.068301540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:27.633698 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:27.633662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:27.634299 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.633858 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:27.634299 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:27.633936 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:31.633918432 +0000 UTC m=+10.168932870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:28.015209 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:28.014908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:28.015209 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:28.015015 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:29.014446 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:29.014395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:29.014884 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:29.014533 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:30.018039 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:30.017420 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:30.018039 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:30.017537 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:31.015129 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:31.014900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:31.015129 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.015034 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:31.561873 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:31.561834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:31.562329 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.562026 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:31.562329 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.562070 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:31.562329 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.562086 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:31.562329 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.562151 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:39.562132223 +0000 UTC m=+18.097146662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:31.662795 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:31.662759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:31.662976 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.662934 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:31.663041 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:31.662992 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:39.662974367 +0000 UTC m=+18.197988800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:32.019077 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:32.018594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:32.019077 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:32.018721 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:33.014668 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:33.014637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:33.015115 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:33.014756 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:34.014402 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:34.014365 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:34.014598 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:34.014513 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:35.014730 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:35.014701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:35.015132 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:35.014808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:36.015062 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:36.015014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:36.015510 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:36.015154 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:37.014982 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:37.014900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:37.015177 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:37.015022 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:38.014774 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.014733 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:38.014956 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:38.014913 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:38.563003 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.562975 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9dg6k"]
Apr 16 19:30:38.575707 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.575686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.578313 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.578282 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:30:38.578313 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.578301 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fr6fn\""
Apr 16 19:30:38.579252 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.579226 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:30:38.715226 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.715199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnb5h\" (UniqueName: \"kubernetes.io/projected/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-kube-api-access-rnb5h\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.715407 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.715242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-hosts-file\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.715407 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.715290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-tmp-dir\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.816223 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.816147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnb5h\" (UniqueName: \"kubernetes.io/projected/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-kube-api-access-rnb5h\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.816223 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.816182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-hosts-file\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.816430 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.816228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-tmp-dir\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.816430 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.816335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-hosts-file\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.825719 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.825666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-tmp-dir\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.825998 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.825980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnb5h\" (UniqueName: \"kubernetes.io/projected/ca3720a8-e72a-4e65-a1f7-4270435ae4e1-kube-api-access-rnb5h\") pod \"node-resolver-9dg6k\" (UID: \"ca3720a8-e72a-4e65-a1f7-4270435ae4e1\") " pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:38.885567 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:38.885539 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9dg6k"
Apr 16 19:30:39.014591 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:39.014557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:39.014745 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.014677 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:39.622092 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:39.622046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:39.622492 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.622220 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:39.622492 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.622241 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:39.622492 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.622251 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:39.622492 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.622308 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:55.622289989 +0000 UTC m=+34.157304432 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:39.723016 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:39.722984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:39.723203 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.723155 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:39.723261 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:39.723227 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:55.723207414 +0000 UTC m=+34.258221861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:40.015191 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:40.015110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:40.015337 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:40.015250 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:41.015028 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:41.014996 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:41.015421 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:41.015106 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45"
Apr 16 19:30:41.269632 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:41.269596 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3720a8_e72a_4e65_a1f7_4270435ae4e1.slice/crio-80c8d61368eadbf96a186aef2ec9d5b4c10e3274dc9279267ce41c155b70dcc8 WatchSource:0}: Error finding container 80c8d61368eadbf96a186aef2ec9d5b4c10e3274dc9279267ce41c155b70dcc8: Status 404 returned error can't find the container with id 80c8d61368eadbf96a186aef2ec9d5b4c10e3274dc9279267ce41c155b70dcc8
Apr 16 19:30:42.014872 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.014698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:42.015034 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:42.014959 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08"
Apr 16 19:30:42.124337 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.124265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ckws9" event={"ID":"8afc67d1-0264-4436-8d0e-ebcfa3237e7f","Type":"ContainerStarted","Data":"47c9c4e2ebdd5f95af51f067b08182a591e262aafa5fd39a1353de3ba4f449b1"}
Apr 16 19:30:42.126772 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"5f220700aca5547ca1f0543d79a9514f8655b502420993c7c13c474b78e72c97"}
Apr 16 19:30:42.126772 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"9c0427891e6d8a7feddf11971e77a561eaab4f1ecb8c433f8fd8d758a85ab239"}
Apr 16 19:30:42.126923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"a8a3f87bca32f3b106e71eff8275e2474a582dcf47881bf5325e1db15b22dee7"}
Apr 16 19:30:42.126923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"7cdfae79f49fecdf2cfd2f0abc7f49e3d6274e3832a066e2c2e741ed3eddf906"}
Apr 16 19:30:42.126923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"b85bdd348572eb56ce6d189b8149502f6ac7c277322126fc08ee302ae463c17d"}
Apr 16 19:30:42.126923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.126815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"a68c2db278cca44c0589ccaa8e4747b3e9381cf535e3bcf02b7d3752feaeba02"}
Apr 16 19:30:42.127970 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.127951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" event={"ID":"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2","Type":"ContainerStarted","Data":"ee5f3bdad30c08197e6109a3a4f6160c84d507663c08d9a5e6cbabaf6f2b47f8"}
Apr 16 19:30:42.129282 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.129259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f289" event={"ID":"e76a993d-4a08-49b0-a7f9-dc97575009ad","Type":"ContainerStarted","Data":"2b930d9413cae5fa82077bf88d64ab5d39742ee81b210f7ffff01b955bb2ae63"}
Apr 16 19:30:42.130675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.130652 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="108e08179693eaf6deedbc774998124c40db002a292fdae487b9986c8bbd3f9a" exitCode=0
Apr 16 19:30:42.130780 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.130733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"108e08179693eaf6deedbc774998124c40db002a292fdae487b9986c8bbd3f9a"}
Apr 16 19:30:42.132084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.132061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwbhq" event={"ID":"be225946-41f7-4fe4-8421-be88c9efe965","Type":"ContainerStarted","Data":"dd529b997035e84824bc42f54504045cc305da51a413dc4d90d5e025c5aa1b63"}
Apr 16 19:30:42.133395 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.133377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7vjxb" event={"ID":"3bbc206b-84a4-45f8-9836-82284b580174","Type":"ContainerStarted","Data":"2a376411d69731021cd07b1764b72067e3fb586c9f7de69badc0a6843d284f2a"}
Apr 16 19:30:42.134546 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.134531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9dg6k" event={"ID":"ca3720a8-e72a-4e65-a1f7-4270435ae4e1","Type":"ContainerStarted","Data":"199d8c7631ae0d81ba81654b8f5b9513bdf03c6e50173f3ae1d15b7c4c8b71f2"}
Apr 16 19:30:42.134626 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.134550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9dg6k" event={"ID":"ca3720a8-e72a-4e65-a1f7-4270435ae4e1","Type":"ContainerStarted","Data":"80c8d61368eadbf96a186aef2ec9d5b4c10e3274dc9279267ce41c155b70dcc8"}
Apr 16 19:30:42.144521 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.144486 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ckws9" podStartSLOduration=3.378628815 podStartE2EDuration="20.144476145s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.500439562 +0000 UTC m=+3.035454001"
lastFinishedPulling="2026-04-16 19:30:41.266286895 +0000 UTC m=+19.801301331" observedRunningTime="2026-04-16 19:30:42.144347532 +0000 UTC m=+20.679361980" watchObservedRunningTime="2026-04-16 19:30:42.144476145 +0000 UTC m=+20.679490593" Apr 16 19:30:42.182964 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.182931 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9dg6k" podStartSLOduration=4.182920936 podStartE2EDuration="4.182920936s" podCreationTimestamp="2026-04-16 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:42.182629323 +0000 UTC m=+20.717643773" watchObservedRunningTime="2026-04-16 19:30:42.182920936 +0000 UTC m=+20.717935383" Apr 16 19:30:42.198775 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.198743 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vwbhq" podStartSLOduration=3.458649983 podStartE2EDuration="20.198732956s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.495405415 +0000 UTC m=+3.030419851" lastFinishedPulling="2026-04-16 19:30:41.235488383 +0000 UTC m=+19.770502824" observedRunningTime="2026-04-16 19:30:42.198283799 +0000 UTC m=+20.733298248" watchObservedRunningTime="2026-04-16 19:30:42.198732956 +0000 UTC m=+20.733747403" Apr 16 19:30:42.214355 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.214327 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5f289" podStartSLOduration=8.169328708 podStartE2EDuration="20.214320293s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.501398313 +0000 UTC m=+3.036412738" lastFinishedPulling="2026-04-16 19:30:36.546389884 +0000 UTC m=+15.081404323" observedRunningTime="2026-04-16 19:30:42.214269476 +0000 UTC 
m=+20.749283921" watchObservedRunningTime="2026-04-16 19:30:42.214320293 +0000 UTC m=+20.749334773" Apr 16 19:30:42.237129 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:42.237087 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7vjxb" podStartSLOduration=3.46582566 podStartE2EDuration="20.237078707s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.495208637 +0000 UTC m=+3.030223062" lastFinishedPulling="2026-04-16 19:30:41.266461683 +0000 UTC m=+19.801476109" observedRunningTime="2026-04-16 19:30:42.23668546 +0000 UTC m=+20.771699907" watchObservedRunningTime="2026-04-16 19:30:42.237078707 +0000 UTC m=+20.772093154" Apr 16 19:30:43.015035 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.015010 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:43.015175 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:43.015154 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:43.048430 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.048405 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:30:43.138159 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.138093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" event={"ID":"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2","Type":"ContainerStarted","Data":"d5401b38e1a60553ec5a3f48d596379224ea03559af5eaa9bca61a5777bffb84"} Apr 16 19:30:43.139712 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.139682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-srwtb" event={"ID":"505f35a8-3d21-4bfe-89d5-dd9c95882f3c","Type":"ContainerStarted","Data":"8c9fcb093152f25bf1fe5c45cd22f2689b3be05ddd0870b2bf5bf78038c4ebae"} Apr 16 19:30:43.157964 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.156840 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-srwtb" podStartSLOduration=4.41428469 podStartE2EDuration="21.156818223s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.492928484 +0000 UTC m=+3.027942925" lastFinishedPulling="2026-04-16 19:30:41.235462015 +0000 UTC m=+19.770476458" observedRunningTime="2026-04-16 19:30:43.15576295 +0000 UTC m=+21.690777399" watchObservedRunningTime="2026-04-16 19:30:43.156818223 +0000 UTC m=+21.691832678" Apr 16 19:30:43.294856 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.294832 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:43.295504 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.295484 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:43.954450 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.954326 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:30:43.048425227Z","UUID":"5a4fcd18-5e40-4550-9bd4-b35cc549cc34","Handler":null,"Name":"","Endpoint":""} Apr 16 19:30:43.956494 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.956330 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:30:43.956630 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:43.956502 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:30:44.014659 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:44.014629 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:44.014839 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:44.014810 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:44.145463 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:44.145424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"5cfd8594737466372694dfdbe4de0c26fb7184a9c84ffd9694b0ab9c586923fe"} Apr 16 19:30:44.145861 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:44.145819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:44.146089 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:44.146065 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vwbhq" Apr 16 19:30:45.014839 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:45.014761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:45.014997 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:45.014885 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:45.149235 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:45.149203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" event={"ID":"b747b1c6-2c2b-4967-8dba-6539fe9fc5d2","Type":"ContainerStarted","Data":"c9a6a7d94a6ea2b6c7700cffbabf8995c32c720eb52558aa3cb104d647a73232"} Apr 16 19:30:45.169067 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:45.169005 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v6bjh" podStartSLOduration=2.994738729 podStartE2EDuration="23.168991351s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.497876645 +0000 UTC m=+3.032891070" lastFinishedPulling="2026-04-16 19:30:44.672129243 +0000 UTC m=+23.207143692" observedRunningTime="2026-04-16 19:30:45.16839754 +0000 UTC m=+23.703412026" watchObservedRunningTime="2026-04-16 19:30:45.168991351 +0000 UTC m=+23.704005799" Apr 16 19:30:46.014295 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:46.014264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:46.014470 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:46.014391 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:47.014844 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.014639 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:47.015493 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:47.014876 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:47.156715 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.156679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" event={"ID":"16c561dd-93b3-4b83-9374-3a46663b8962","Type":"ContainerStarted","Data":"151dc88ec8cbce606dc44074c346ff1b37d24ad1311c0f0af722df929aac9851"} Apr 16 19:30:47.156952 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.156921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:47.158390 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.158363 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="51ae9f4b44486e4dc094272877581f9b95958de775af0ed71c1bff5a04e39121" exitCode=0 Apr 16 19:30:47.158530 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.158410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"51ae9f4b44486e4dc094272877581f9b95958de775af0ed71c1bff5a04e39121"} Apr 16 19:30:47.171631 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:47.171612 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:47.187559 ip-10-0-133-241 
kubenswrapper[2575]: I0416 19:30:47.187525 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" podStartSLOduration=8.188270805 podStartE2EDuration="25.18751362s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.498601634 +0000 UTC m=+3.033616064" lastFinishedPulling="2026-04-16 19:30:41.497844451 +0000 UTC m=+20.032858879" observedRunningTime="2026-04-16 19:30:47.185856418 +0000 UTC m=+25.720870866" watchObservedRunningTime="2026-04-16 19:30:47.18751362 +0000 UTC m=+25.722528097" Apr 16 19:30:48.014612 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.014582 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:48.014736 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:48.014688 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:48.162685 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.162602 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="9d0dba5f0889dbc61763f7a198057a94ed955b66245a3c74351c820b28834e31" exitCode=0 Apr 16 19:30:48.163135 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.162684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"9d0dba5f0889dbc61763f7a198057a94ed955b66245a3c74351c820b28834e31"} Apr 16 19:30:48.163386 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.163359 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:48.163521 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.163393 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:48.177817 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.177799 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:30:48.538106 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.537787 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-brgng"] Apr 16 19:30:48.538291 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.538152 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:48.538291 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:48.538257 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:48.540178 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.540150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lxtld"] Apr 16 19:30:48.540362 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:48.540248 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:48.540415 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:48.540354 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:49.166251 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:49.166165 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="a514cdeb32d68c415162cf3327c7fdc1c7871795409bf70323fe7489ec501af8" exitCode=0 Apr 16 19:30:49.166654 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:49.166238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"a514cdeb32d68c415162cf3327c7fdc1c7871795409bf70323fe7489ec501af8"} Apr 16 19:30:50.015036 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:50.014998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:50.015036 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:50.015001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:50.015282 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:50.015152 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:50.015282 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:50.015222 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:52.015872 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:52.015840 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:52.016520 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:52.015922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:52.016520 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:52.016033 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:52.016520 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:52.016135 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:54.014425 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.014395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:30:54.014901 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.014394 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:30:54.014901 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.014528 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-brgng" podUID="73ec7853-4c17-4a50-b708-9ba3534b6b45" Apr 16 19:30:54.014901 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.014584 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lxtld" podUID="046d5342-ca0b-4fe3-b388-6fa9f477de08" Apr 16 19:30:54.323608 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.323528 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-241.ec2.internal" event="NodeReady" Apr 16 19:30:54.323757 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.323678 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:30:54.367497 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.367464 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tv9rh"] Apr 16 19:30:54.411652 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.411626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q8rl9"] Apr 16 19:30:54.411820 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.411801 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tv9rh" Apr 16 19:30:54.414325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.414221 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkm8b\"" Apr 16 19:30:54.414325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.414246 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:30:54.414325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.414253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:30:54.426872 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.426852 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tv9rh"] Apr 16 19:30:54.427176 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.427148 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q8rl9" Apr 16 19:30:54.427753 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.426882 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q8rl9"] Apr 16 19:30:54.429942 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.429922 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:30:54.430074 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.430042 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\"" Apr 16 19:30:54.431292 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.431273 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:30:54.431548 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.431493 2575 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 19:30:54.538915 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.538885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.539096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.538936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqfr\" (UniqueName: \"kubernetes.io/projected/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-kube-api-access-bpqfr\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:54.539096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.538985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:54.539096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.539087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-tmp-dir\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.539249 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.539120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82kg\" (UniqueName: \"kubernetes.io/projected/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-kube-api-access-k82kg\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.539249 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.539227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-config-volume\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.640258 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-config-volume\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.640258 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.640454 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqfr\" (UniqueName: \"kubernetes.io/projected/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-kube-api-access-bpqfr\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:54.640454 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.640392 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:54.640540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:54.640588 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.640549 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls podName:39bb06a3-3f67-42dd-9b2d-1fef39ab08c7 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:55.140524957 +0000 UTC m=+33.675539397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls") pod "dns-default-tv9rh" (UID: "39bb06a3-3f67-42dd-9b2d-1fef39ab08c7") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:54.640588 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-tmp-dir\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.640690 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.640603 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:54.640690 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k82kg\" (UniqueName: \"kubernetes.io/projected/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-kube-api-access-k82kg\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.640690 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:54.640690 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:55.140675748 +0000 UTC m=+33.675690187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:30:54.640944 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.640913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-tmp-dir\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.651102 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.651075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-config-volume\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.653402 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.653384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82kg\" (UniqueName: \"kubernetes.io/projected/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-kube-api-access-k82kg\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:54.653606 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:54.653585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqfr\" (UniqueName: \"kubernetes.io/projected/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-kube-api-access-bpqfr\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:55.144422 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:55.144227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:55.144738 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.144367 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:55.144738 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:55.144528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:55.144738 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.144550 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:56.144531387 +0000 UTC m=+34.679545813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:30:55.144738 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.144633 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:55.144738 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.144698 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls podName:39bb06a3-3f67-42dd-9b2d-1fef39ab08c7 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:56.144682627 +0000 UTC m=+34.679697068 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls") pod "dns-default-tv9rh" (UID: "39bb06a3-3f67-42dd-9b2d-1fef39ab08c7") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:55.647589 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:55.647507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:55.647761 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.647672 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:55.647761 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.647696 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:55.647761 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.647706 2575 projected.go:194] Error preparing data for projected volume kube-api-access-92q6k for pod openshift-network-diagnostics/network-check-target-brgng: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:55.647761 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.647756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k podName:73ec7853-4c17-4a50-b708-9ba3534b6b45 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:27.647742703 +0000 UTC m=+66.182757129 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-92q6k" (UniqueName: "kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k") pod "network-check-target-brgng" (UID: "73ec7853-4c17-4a50-b708-9ba3534b6b45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:55.748667 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:55.748639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:55.748797 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.748780 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:55.748845 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:55.748835 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs podName:046d5342-ca0b-4fe3-b388-6fa9f477de08 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:27.74882376 +0000 UTC m=+66.283838188 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs") pod "network-metrics-daemon-lxtld" (UID: "046d5342-ca0b-4fe3-b388-6fa9f477de08") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:56.017655 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.017591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld"
Apr 16 19:30:56.017786 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.017591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:30:56.021706 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.021681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:30:56.021841 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.021681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:30:56.021841 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.021731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\""
Apr 16 19:30:56.021841 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.021691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:30:56.021841 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.021680 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\""
Apr 16 19:30:56.151322 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.151287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:56.151720 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:56.151402 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:56.151720 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:56.151448 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls podName:39bb06a3-3f67-42dd-9b2d-1fef39ab08c7 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:58.15143515 +0000 UTC m=+36.686449576 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls") pod "dns-default-tv9rh" (UID: "39bb06a3-3f67-42dd-9b2d-1fef39ab08c7") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:56.151720 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.151443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:56.151720 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:56.151566 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:56.151720 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:56.151649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:58.151611679 +0000 UTC m=+36.686626110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:30:56.181982 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.181959 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="67dc9127cc52e759cb58d5943697e5776c6d041b438a31f289f5557875da809b" exitCode=0
Apr 16 19:30:56.182118 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:56.181990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"67dc9127cc52e759cb58d5943697e5776c6d041b438a31f289f5557875da809b"}
Apr 16 19:30:57.186435 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:57.186406 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c96c659-d972-4967-bf3d-e50d4088b9e5" containerID="4d6e76d312a5db45bf36efddaf55c14274caf4475b3f2b4868a32e44dd8c4a46" exitCode=0
Apr 16 19:30:57.186755 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:57.186462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerDied","Data":"4d6e76d312a5db45bf36efddaf55c14274caf4475b3f2b4868a32e44dd8c4a46"}
Apr 16 19:30:58.166522 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.166489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:30:58.166701 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.166546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:30:58.166701 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:58.166650 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:58.166801 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:58.166711 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:02.166697145 +0000 UTC m=+40.701711584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:30:58.166801 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:58.166650 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:58.166887 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:30:58.166817 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls podName:39bb06a3-3f67-42dd-9b2d-1fef39ab08c7 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:02.166794544 +0000 UTC m=+40.701808995 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls") pod "dns-default-tv9rh" (UID: "39bb06a3-3f67-42dd-9b2d-1fef39ab08c7") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:58.190666 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.190639 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sslrf" event={"ID":"0c96c659-d972-4967-bf3d-e50d4088b9e5","Type":"ContainerStarted","Data":"7146c8c83eed1d9ca6dea0a7b4f51b945d96e1bc5fac8c28f79970875b6bb513"}
Apr 16 19:30:58.216758 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.216722 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sslrf" podStartSLOduration=5.657419719 podStartE2EDuration="36.216710082s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:24.499727805 +0000 UTC m=+3.034742231" lastFinishedPulling="2026-04-16 19:30:55.059018165 +0000 UTC m=+33.594032594" observedRunningTime="2026-04-16 19:30:58.215272068 +0000 UTC m=+36.750286516" watchObservedRunningTime="2026-04-16 19:30:58.216710082 +0000 UTC m=+36.751724530"
Apr 16 19:30:58.492729 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.492701 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"]
Apr 16 19:30:58.515107 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.515083 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"]
Apr 16 19:30:58.515225 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.515162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"
Apr 16 19:30:58.517468 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.517445 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:58.517607 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.517545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 19:30:58.517672 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.517649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vnzd7\""
Apr 16 19:30:58.669592 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.669564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjf28\" (UniqueName: \"kubernetes.io/projected/6c4fa5d0-8ce9-444f-9306-774a0f7d068c-kube-api-access-kjf28\") pod \"migrator-74bb7799d9-xhdqf\" (UID: \"6c4fa5d0-8ce9-444f-9306-774a0f7d068c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"
Apr 16 19:30:58.770669 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.770601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjf28\" (UniqueName: \"kubernetes.io/projected/6c4fa5d0-8ce9-444f-9306-774a0f7d068c-kube-api-access-kjf28\") pod \"migrator-74bb7799d9-xhdqf\" (UID: \"6c4fa5d0-8ce9-444f-9306-774a0f7d068c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"
Apr 16 19:30:58.784674 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.784642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjf28\" (UniqueName: \"kubernetes.io/projected/6c4fa5d0-8ce9-444f-9306-774a0f7d068c-kube-api-access-kjf28\") pod \"migrator-74bb7799d9-xhdqf\" (UID: \"6c4fa5d0-8ce9-444f-9306-774a0f7d068c\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"
Apr 16 19:30:58.823706 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.823685 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"
Apr 16 19:30:58.998201 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:58.998174 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf"]
Apr 16 19:30:59.003386 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:30:59.003360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c4fa5d0_8ce9_444f_9306_774a0f7d068c.slice/crio-3ae49c9f0ce909fa411abf9e1f1652b776d5081840f1e5dcefec68406dc39607 WatchSource:0}: Error finding container 3ae49c9f0ce909fa411abf9e1f1652b776d5081840f1e5dcefec68406dc39607: Status 404 returned error can't find the container with id 3ae49c9f0ce909fa411abf9e1f1652b776d5081840f1e5dcefec68406dc39607
Apr 16 19:30:59.193403 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:30:59.193373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf" event={"ID":"6c4fa5d0-8ce9-444f-9306-774a0f7d068c","Type":"ContainerStarted","Data":"3ae49c9f0ce909fa411abf9e1f1652b776d5081840f1e5dcefec68406dc39607"}
Apr 16 19:31:01.110032 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.110009 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7mlwf"]
Apr 16 19:31:01.132455 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.132430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7mlwf"]
Apr 16 19:31:01.132578 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.132536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.135571 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.135552 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 19:31:01.136506 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.136485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-zchq6\""
Apr 16 19:31:01.136506 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.136508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 19:31:01.136686 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.136484 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 19:31:01.136686 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.136637 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 19:31:01.198593 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.198568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf" event={"ID":"6c4fa5d0-8ce9-444f-9306-774a0f7d068c","Type":"ContainerStarted","Data":"d20e042b28c79a0573210908ffe028d6005cada6e4fd1b8d127fdb9a4849691b"}
Apr 16 19:31:01.288318 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.288295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxcjq\" (UniqueName: \"kubernetes.io/projected/740aa417-8ae4-4381-801e-42eb554bef2e-kube-api-access-qxcjq\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.288408 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.288352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/740aa417-8ae4-4381-801e-42eb554bef2e-signing-cabundle\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.288408 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.288374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/740aa417-8ae4-4381-801e-42eb554bef2e-signing-key\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.389425 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.389402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/740aa417-8ae4-4381-801e-42eb554bef2e-signing-key\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.389528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.389448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxcjq\" (UniqueName: \"kubernetes.io/projected/740aa417-8ae4-4381-801e-42eb554bef2e-kube-api-access-qxcjq\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.389528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.389509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/740aa417-8ae4-4381-801e-42eb554bef2e-signing-cabundle\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.390037 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.390015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/740aa417-8ae4-4381-801e-42eb554bef2e-signing-cabundle\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.392307 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.392288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/740aa417-8ae4-4381-801e-42eb554bef2e-signing-key\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.400916 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.400896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxcjq\" (UniqueName: \"kubernetes.io/projected/740aa417-8ae4-4381-801e-42eb554bef2e-kube-api-access-qxcjq\") pod \"service-ca-865cb79987-7mlwf\" (UID: \"740aa417-8ae4-4381-801e-42eb554bef2e\") " pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.441121 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.441103 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-7mlwf"
Apr 16 19:31:01.547722 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:01.547694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-7mlwf"]
Apr 16 19:31:01.550646 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:01.550623 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod740aa417_8ae4_4381_801e_42eb554bef2e.slice/crio-d256ebda03be43d38b51870a19285bb0eaa7f017e3442800eb174715211d1284 WatchSource:0}: Error finding container d256ebda03be43d38b51870a19285bb0eaa7f017e3442800eb174715211d1284: Status 404 returned error can't find the container with id d256ebda03be43d38b51870a19285bb0eaa7f017e3442800eb174715211d1284
Apr 16 19:31:02.195952 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:02.195918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:31:02.196403 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:02.195994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:31:02.196403 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:02.196089 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:31:02.196403 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:02.196140 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:31:02.196403 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:02.196152 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:10.196136224 +0000 UTC m=+48.731150670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:31:02.196403 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:02.196219 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls podName:39bb06a3-3f67-42dd-9b2d-1fef39ab08c7 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:10.196200416 +0000 UTC m=+48.731214843 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls") pod "dns-default-tv9rh" (UID: "39bb06a3-3f67-42dd-9b2d-1fef39ab08c7") : secret "dns-default-metrics-tls" not found
Apr 16 19:31:02.201581 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:02.201556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7mlwf" event={"ID":"740aa417-8ae4-4381-801e-42eb554bef2e","Type":"ContainerStarted","Data":"d256ebda03be43d38b51870a19285bb0eaa7f017e3442800eb174715211d1284"}
Apr 16 19:31:02.203339 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:02.203315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf" event={"ID":"6c4fa5d0-8ce9-444f-9306-774a0f7d068c","Type":"ContainerStarted","Data":"630214e21c1ecec7902a47f4c2582ea020570cd139366f185acf0239e2be7b52"}
Apr 16 19:31:02.221810 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:02.221765 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xhdqf" podStartSLOduration=2.182079552 podStartE2EDuration="4.221752696s" podCreationTimestamp="2026-04-16 19:30:58 +0000 UTC" firstStartedPulling="2026-04-16 19:30:59.00560971 +0000 UTC m=+37.540624136" lastFinishedPulling="2026-04-16 19:31:01.045282849 +0000 UTC m=+39.580297280" observedRunningTime="2026-04-16 19:31:02.221505723 +0000 UTC m=+40.756520171" watchObservedRunningTime="2026-04-16 19:31:02.221752696 +0000 UTC m=+40.756767145"
Apr 16 19:31:05.209517 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:05.209479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-7mlwf" event={"ID":"740aa417-8ae4-4381-801e-42eb554bef2e","Type":"ContainerStarted","Data":"31a1e61ab0cbc77d4337bd863015c29325d61dea4ca17e670267d924d9a5a1e1"}
Apr 16 19:31:05.225492 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:05.225449 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-7mlwf" podStartSLOduration=1.602202452 podStartE2EDuration="4.2254363s" podCreationTimestamp="2026-04-16 19:31:01 +0000 UTC" firstStartedPulling="2026-04-16 19:31:01.552554721 +0000 UTC m=+40.087569146" lastFinishedPulling="2026-04-16 19:31:04.175788568 +0000 UTC m=+42.710802994" observedRunningTime="2026-04-16 19:31:05.22476547 +0000 UTC m=+43.759779922" watchObservedRunningTime="2026-04-16 19:31:05.2254363 +0000 UTC m=+43.760450778"
Apr 16 19:31:10.252236 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:10.252201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:31:10.252627 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:10.252263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9"
Apr 16 19:31:10.252627 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:10.252374 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:31:10.252627 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:10.252443 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert podName:5ba63d7e-2cb5-4b7c-8bbd-b135de519a76 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:26.252425106 +0000 UTC m=+64.787439538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert") pod "ingress-canary-q8rl9" (UID: "5ba63d7e-2cb5-4b7c-8bbd-b135de519a76") : secret "canary-serving-cert" not found
Apr 16 19:31:10.254595 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:10.254573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bb06a3-3f67-42dd-9b2d-1fef39ab08c7-metrics-tls\") pod \"dns-default-tv9rh\" (UID: \"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7\") " pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:31:10.322928 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:10.322891 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tv9rh"
Apr 16 19:31:10.431753 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:10.431725 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tv9rh"]
Apr 16 19:31:10.435107 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:10.435082 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39bb06a3_3f67_42dd_9b2d_1fef39ab08c7.slice/crio-aaab398334bde4642c3dabc11528a3b6a32494b07ae29a567f672cd54c71c1fe WatchSource:0}: Error finding container aaab398334bde4642c3dabc11528a3b6a32494b07ae29a567f672cd54c71c1fe: Status 404 returned error can't find the container with id aaab398334bde4642c3dabc11528a3b6a32494b07ae29a567f672cd54c71c1fe
Apr 16 19:31:11.221980 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:11.221942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tv9rh" event={"ID":"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7","Type":"ContainerStarted","Data":"aaab398334bde4642c3dabc11528a3b6a32494b07ae29a567f672cd54c71c1fe"}
Apr 16 19:31:13.227416 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:13.227373 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-dns/dns-default-tv9rh" event={"ID":"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7","Type":"ContainerStarted","Data":"a3831191a05795c15619fbe621e78b10e421daee2c688d4cb285b7171ed0df96"} Apr 16 19:31:13.227416 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:13.227414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tv9rh" event={"ID":"39bb06a3-3f67-42dd-9b2d-1fef39ab08c7","Type":"ContainerStarted","Data":"372a5b46fd5a9c372a7a143b77f38fa5e55cd3ae560d029e0edee77dc2fea783"} Apr 16 19:31:13.227944 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:13.227588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tv9rh" Apr 16 19:31:13.244064 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:13.244018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tv9rh" podStartSLOduration=17.526717616 podStartE2EDuration="19.244008168s" podCreationTimestamp="2026-04-16 19:30:54 +0000 UTC" firstStartedPulling="2026-04-16 19:31:10.436990612 +0000 UTC m=+48.972005038" lastFinishedPulling="2026-04-16 19:31:12.154281164 +0000 UTC m=+50.689295590" observedRunningTime="2026-04-16 19:31:13.243909871 +0000 UTC m=+51.778924322" watchObservedRunningTime="2026-04-16 19:31:13.244008168 +0000 UTC m=+51.779022616" Apr 16 19:31:19.611248 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.611218 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5r8bs"] Apr 16 19:31:19.628592 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.628559 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5"] Apr 16 19:31:19.628736 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.628717 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.631743 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.631721 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:31:19.634728 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.634710 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:31:19.635029 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.635014 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:31:19.642285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.642264 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5r8bs"] Apr 16 19:31:19.642415 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.642404 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.651854 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.651838 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:31:19.652094 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.652079 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-fj7mw\"" Apr 16 19:31:19.652483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.652465 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5"] Apr 16 19:31:19.654633 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.654617 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:31:19.655900 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.655875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:31:19.656182 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.656169 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:31:19.656257 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.656242 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:31:19.656322 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.656252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:31:19.656480 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.656466 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:31:19.656946 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.656931 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:31:19.657839 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.657823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:31:19.716217 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716217 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716344 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716344 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8j8b\" (UniqueName: \"kubernetes.io/projected/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-kube-api-access-l8j8b\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716344 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716449 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.716449 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-tmp\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 
19:31:19.716449 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-snapshots\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.716449 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-service-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.716449 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf25g\" (UniqueName: \"kubernetes.io/projected/4536532d-6693-4345-ac0d-083b15a27e72-kube-api-access-qf25g\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.716612 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4536532d-6693-4345-ac0d-083b15a27e72-serving-cert\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.716612 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.716545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817003 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.816981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8j8b\" (UniqueName: \"kubernetes.io/projected/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-kube-api-access-l8j8b\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817345 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.817345 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-tmp\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817345 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-snapshots\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817345 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817283 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-service-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817345 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf25g\" (UniqueName: \"kubernetes.io/projected/4536532d-6693-4345-ac0d-083b15a27e72-kube-api-access-qf25g\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817600 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4536532d-6693-4345-ac0d-083b15a27e72-serving-cert\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.817600 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.817395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.818044 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.818021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-snapshots\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " 
pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.818148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.818129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.818431 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.818406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.818557 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.818457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4536532d-6693-4345-ac0d-083b15a27e72-service-ca-bundle\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.818623 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.818603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4536532d-6693-4345-ac0d-083b15a27e72-tmp\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.820298 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.820254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4536532d-6693-4345-ac0d-083b15a27e72-serving-cert\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.820298 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.820272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-ca\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.820445 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.820310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.820445 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.820320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.820554 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.820539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-hub\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.840180 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.840152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8j8b\" (UniqueName: \"kubernetes.io/projected/979fdb4e-2eb8-47c2-ae58-03b07fefb3c4-kube-api-access-l8j8b\") pod \"cluster-proxy-proxy-agent-6c56767fbd-lh7f5\" (UID: \"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:19.840288 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.840273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf25g\" (UniqueName: \"kubernetes.io/projected/4536532d-6693-4345-ac0d-083b15a27e72-kube-api-access-qf25g\") pod \"insights-operator-585dfdc468-5r8bs\" (UID: \"4536532d-6693-4345-ac0d-083b15a27e72\") " pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.939528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.939504 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5r8bs" Apr 16 19:31:19.961320 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:19.961296 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" Apr 16 19:31:20.068761 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:20.068618 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5r8bs"] Apr 16 19:31:20.071111 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:20.071082 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4536532d_6693_4345_ac0d_083b15a27e72.slice/crio-629abf79b84a10f6d84c50b19f359e1f60e742299b5e99030dbd3b72da9e4c3b WatchSource:0}: Error finding container 629abf79b84a10f6d84c50b19f359e1f60e742299b5e99030dbd3b72da9e4c3b: Status 404 returned error can't find the container with id 629abf79b84a10f6d84c50b19f359e1f60e742299b5e99030dbd3b72da9e4c3b Apr 16 19:31:20.086264 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:20.086243 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5"] Apr 16 19:31:20.089345 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:20.089322 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979fdb4e_2eb8_47c2_ae58_03b07fefb3c4.slice/crio-7a1401baee6a78142cf4f8866ab4702dc9450f764e9c4b646504710cdd98dffb WatchSource:0}: Error finding container 7a1401baee6a78142cf4f8866ab4702dc9450f764e9c4b646504710cdd98dffb: Status 404 returned error can't find the container with id 7a1401baee6a78142cf4f8866ab4702dc9450f764e9c4b646504710cdd98dffb Apr 16 19:31:20.179161 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:20.179137 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gxnqw" Apr 16 19:31:20.242015 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:20.241931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" event={"ID":"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4","Type":"ContainerStarted","Data":"7a1401baee6a78142cf4f8866ab4702dc9450f764e9c4b646504710cdd98dffb"} Apr 16 19:31:20.244717 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:20.244679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5r8bs" event={"ID":"4536532d-6693-4345-ac0d-083b15a27e72","Type":"ContainerStarted","Data":"629abf79b84a10f6d84c50b19f359e1f60e742299b5e99030dbd3b72da9e4c3b"} Apr 16 19:31:23.231693 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:23.231665 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tv9rh" Apr 16 19:31:24.254663 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:24.254622 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" event={"ID":"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4","Type":"ContainerStarted","Data":"e68cef6c08db17105f587a9cd3d3c89fcce687d53ee5c082f7c33a33245c751b"} Apr 16 19:31:24.256244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:24.256215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5r8bs" event={"ID":"4536532d-6693-4345-ac0d-083b15a27e72","Type":"ContainerStarted","Data":"5d5fb51acab22ec53d0f15305bb02d65bb7380acf8ed1d1d71b4b8c1262f3c37"} Apr 16 19:31:24.273745 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:24.273693 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5r8bs" podStartSLOduration=2.054855398 podStartE2EDuration="5.273675448s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="2026-04-16 19:31:20.0730463 +0000 UTC m=+58.608060730" lastFinishedPulling="2026-04-16 19:31:23.29186635 +0000 UTC m=+61.826880780" 
observedRunningTime="2026-04-16 19:31:24.272829005 +0000 UTC m=+62.807843454" watchObservedRunningTime="2026-04-16 19:31:24.273675448 +0000 UTC m=+62.808689897" Apr 16 19:31:25.722911 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:25.722851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tv9rh_39bb06a3-3f67-42dd-9b2d-1fef39ab08c7/dns/0.log" Apr 16 19:31:25.906540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:25.906512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tv9rh_39bb06a3-3f67-42dd-9b2d-1fef39ab08c7/kube-rbac-proxy/0.log" Apr 16 19:31:26.263834 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.263803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" event={"ID":"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4","Type":"ContainerStarted","Data":"d3339e6a4addefabba9ff403aa5238b68e9c07cb03c67dc5da49cead7b5c7e35"} Apr 16 19:31:26.263834 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.263835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" event={"ID":"979fdb4e-2eb8-47c2-ae58-03b07fefb3c4","Type":"ContainerStarted","Data":"f07e5f907d126dd5f7db615efd3b101a51f702daa366fb08b15afa100f190175"} Apr 16 19:31:26.267975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.267956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9" Apr 16 19:31:26.270227 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.270212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5ba63d7e-2cb5-4b7c-8bbd-b135de519a76-cert\") pod \"ingress-canary-q8rl9\" (UID: \"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76\") " pod="openshift-ingress-canary/ingress-canary-q8rl9" Apr 16 19:31:26.283924 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.283879 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c56767fbd-lh7f5" podStartSLOduration=1.934830704 podStartE2EDuration="7.28386806s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="2026-04-16 19:31:20.090853624 +0000 UTC m=+58.625868050" lastFinishedPulling="2026-04-16 19:31:25.43989098 +0000 UTC m=+63.974905406" observedRunningTime="2026-04-16 19:31:26.28345748 +0000 UTC m=+64.818471928" watchObservedRunningTime="2026-04-16 19:31:26.28386806 +0000 UTC m=+64.818882508" Apr 16 19:31:26.505526 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.505502 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9dg6k_ca3720a8-e72a-4e65-a1f7-4270435ae4e1/dns-node-resolver/0.log" Apr 16 19:31:26.539487 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.539435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\"" Apr 16 19:31:26.547352 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.547334 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q8rl9" Apr 16 19:31:26.656484 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:26.656455 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q8rl9"] Apr 16 19:31:26.659213 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:26.659182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba63d7e_2cb5_4b7c_8bbd_b135de519a76.slice/crio-cac84efaca8bb5f1f113177cce79b6956c12e95476013c0d91a4f5db264da34e WatchSource:0}: Error finding container cac84efaca8bb5f1f113177cce79b6956c12e95476013c0d91a4f5db264da34e: Status 404 returned error can't find the container with id cac84efaca8bb5f1f113177cce79b6956c12e95476013c0d91a4f5db264da34e Apr 16 19:31:27.267946 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.267904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q8rl9" event={"ID":"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76","Type":"ContainerStarted","Data":"cac84efaca8bb5f1f113177cce79b6956c12e95476013c0d91a4f5db264da34e"} Apr 16 19:31:27.506414 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.506388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5f289_e76a993d-4a08-49b0-a7f9-dc97575009ad/node-ca/0.log" Apr 16 19:31:27.677087 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.677039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:31:27.679606 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.679587 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:31:27.689797 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.689772 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:31:27.709900 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.709878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92q6k\" (UniqueName: \"kubernetes.io/projected/73ec7853-4c17-4a50-b708-9ba3534b6b45-kube-api-access-92q6k\") pod \"network-check-target-brgng\" (UID: \"73ec7853-4c17-4a50-b708-9ba3534b6b45\") " pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:31:27.777957 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.777927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:31:27.780592 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.780572 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:31:27.790767 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.790745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/046d5342-ca0b-4fe3-b388-6fa9f477de08-metrics-certs\") pod \"network-metrics-daemon-lxtld\" (UID: \"046d5342-ca0b-4fe3-b388-6fa9f477de08\") " pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:31:27.830400 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.830375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\"" Apr 16 19:31:27.835129 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.835111 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\"" Apr 16 19:31:27.838171 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.838157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lxtld" Apr 16 19:31:27.842759 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.842738 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:31:27.980005 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.979976 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lxtld"] Apr 16 19:31:27.989082 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:27.989026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f9rdz"] Apr 16 19:31:28.011360 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.011337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-brgng"] Apr 16 19:31:28.011471 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.011366 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f9rdz"] Apr 16 19:31:28.011529 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.011498 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.014442 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.014271 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:31:28.014442 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.014316 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jq9hf\"" Apr 16 19:31:28.014442 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.014342 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:31:28.080383 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.080322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.080383 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.080374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a2ce730-a721-4a99-8ce5-0e1c3344a897-crio-socket\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.080549 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.080410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psvm5\" (UniqueName: \"kubernetes.io/projected/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-api-access-psvm5\") pod \"insights-runtime-extractor-f9rdz\" (UID: 
\"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.080549 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.080441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.080549 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.080501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a2ce730-a721-4a99-8ce5-0e1c3344a897-data-volume\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psvm5\" (UniqueName: \"kubernetes.io/projected/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-api-access-psvm5\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181443 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181260 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a2ce730-a721-4a99-8ce5-0e1c3344a897-data-volume\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181443 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181443 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:28.181353 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.181443 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:28.181426 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls podName:2a2ce730-a721-4a99-8ce5-0e1c3344a897 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:28.681403551 +0000 UTC m=+67.216417992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls") pod "insights-runtime-extractor-f9rdz" (UID: "2a2ce730-a721-4a99-8ce5-0e1c3344a897") : secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.181644 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a2ce730-a721-4a99-8ce5-0e1c3344a897-crio-socket\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181644 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2a2ce730-a721-4a99-8ce5-0e1c3344a897-crio-socket\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2a2ce730-a721-4a99-8ce5-0e1c3344a897-data-volume\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.181924 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.181906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 
19:31:28.191905 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.191878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psvm5\" (UniqueName: \"kubernetes.io/projected/2a2ce730-a721-4a99-8ce5-0e1c3344a897-kube-api-access-psvm5\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.457848 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:28.457769 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046d5342_ca0b_4fe3_b388_6fa9f477de08.slice/crio-01101c97a767bc9fd94c9905b8f7e4638392013494da3b60496ad5c47957766d WatchSource:0}: Error finding container 01101c97a767bc9fd94c9905b8f7e4638392013494da3b60496ad5c47957766d: Status 404 returned error can't find the container with id 01101c97a767bc9fd94c9905b8f7e4638392013494da3b60496ad5c47957766d Apr 16 19:31:28.458517 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:28.458492 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ec7853_4c17_4a50_b708_9ba3534b6b45.slice/crio-227114d883aef88a49bdf586fafc71b62351415e63db75126d1aeb0280a27257 WatchSource:0}: Error finding container 227114d883aef88a49bdf586fafc71b62351415e63db75126d1aeb0280a27257: Status 404 returned error can't find the container with id 227114d883aef88a49bdf586fafc71b62351415e63db75126d1aeb0280a27257 Apr 16 19:31:28.685198 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.685171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 
16 19:31:28.687272 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.687251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2a2ce730-a721-4a99-8ce5-0e1c3344a897-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f9rdz\" (UID: \"2a2ce730-a721-4a99-8ce5-0e1c3344a897\") " pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:28.921931 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:28.921895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f9rdz" Apr 16 19:31:29.067704 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.067676 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f9rdz"] Apr 16 19:31:29.071969 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:29.071937 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a2ce730_a721_4a99_8ce5_0e1c3344a897.slice/crio-f917641178a8b7a3374a08658e99173d3363aa4314509c7f550af969031c8435 WatchSource:0}: Error finding container f917641178a8b7a3374a08658e99173d3363aa4314509c7f550af969031c8435: Status 404 returned error can't find the container with id f917641178a8b7a3374a08658e99173d3363aa4314509c7f550af969031c8435 Apr 16 19:31:29.275959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.275873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q8rl9" event={"ID":"5ba63d7e-2cb5-4b7c-8bbd-b135de519a76","Type":"ContainerStarted","Data":"facb9de07dc00641e44bd1b763574a8694a335ecfb3f2d0f736439e5335cedea"} Apr 16 19:31:29.277629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.277594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f9rdz" 
event={"ID":"2a2ce730-a721-4a99-8ce5-0e1c3344a897","Type":"ContainerStarted","Data":"fe35ef8ff8316dde007de74664a771a56c66b2a26731e0a163c51e11d65907f2"} Apr 16 19:31:29.277629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.277623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f9rdz" event={"ID":"2a2ce730-a721-4a99-8ce5-0e1c3344a897","Type":"ContainerStarted","Data":"f917641178a8b7a3374a08658e99173d3363aa4314509c7f550af969031c8435"} Apr 16 19:31:29.278823 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.278795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lxtld" event={"ID":"046d5342-ca0b-4fe3-b388-6fa9f477de08","Type":"ContainerStarted","Data":"01101c97a767bc9fd94c9905b8f7e4638392013494da3b60496ad5c47957766d"} Apr 16 19:31:29.279921 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.279899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-brgng" event={"ID":"73ec7853-4c17-4a50-b708-9ba3534b6b45","Type":"ContainerStarted","Data":"227114d883aef88a49bdf586fafc71b62351415e63db75126d1aeb0280a27257"} Apr 16 19:31:29.291287 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:29.291249 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q8rl9" podStartSLOduration=33.33085206 podStartE2EDuration="35.291236978s" podCreationTimestamp="2026-04-16 19:30:54 +0000 UTC" firstStartedPulling="2026-04-16 19:31:26.661275965 +0000 UTC m=+65.196290394" lastFinishedPulling="2026-04-16 19:31:28.621660886 +0000 UTC m=+67.156675312" observedRunningTime="2026-04-16 19:31:29.291170549 +0000 UTC m=+67.826184999" watchObservedRunningTime="2026-04-16 19:31:29.291236978 +0000 UTC m=+67.826251426" Apr 16 19:31:30.285127 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:30.285087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-f9rdz" event={"ID":"2a2ce730-a721-4a99-8ce5-0e1c3344a897","Type":"ContainerStarted","Data":"a4e67de58ab7a200a2bad86d96a20ddaa2ff5ee4d80002accb06f7f36f8b546d"} Apr 16 19:31:30.286726 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:30.286694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lxtld" event={"ID":"046d5342-ca0b-4fe3-b388-6fa9f477de08","Type":"ContainerStarted","Data":"512e86ca65b636650772ab0c061a88df3934577ebf2542e3a7ef327ee2328225"} Apr 16 19:31:30.286844 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:30.286731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lxtld" event={"ID":"046d5342-ca0b-4fe3-b388-6fa9f477de08","Type":"ContainerStarted","Data":"44d49f634ae546a6ebda26f53577135cf56a16e6c330e888c8ca07d37f618a2e"} Apr 16 19:31:30.304257 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:30.304196 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lxtld" podStartSLOduration=67.154552357 podStartE2EDuration="1m8.304175871s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:31:28.45961693 +0000 UTC m=+66.994631356" lastFinishedPulling="2026-04-16 19:31:29.609240424 +0000 UTC m=+68.144254870" observedRunningTime="2026-04-16 19:31:30.303224776 +0000 UTC m=+68.838239225" watchObservedRunningTime="2026-04-16 19:31:30.304175871 +0000 UTC m=+68.839190320" Apr 16 19:31:32.293285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:32.293257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f9rdz" event={"ID":"2a2ce730-a721-4a99-8ce5-0e1c3344a897","Type":"ContainerStarted","Data":"b7188d599438d2c7e8786126a99c0c23e36e99b42bda5aacca5ddf00e9395478"} Apr 16 19:31:32.294540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:32.294517 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-brgng" event={"ID":"73ec7853-4c17-4a50-b708-9ba3534b6b45","Type":"ContainerStarted","Data":"4ab6dfef99371653b2b98c234e1fe4b9226340a2091dbb67a1e22e71356507fa"} Apr 16 19:31:32.294652 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:32.294634 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-brgng" Apr 16 19:31:32.313895 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:32.313847 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f9rdz" podStartSLOduration=2.277514899 podStartE2EDuration="5.313833616s" podCreationTimestamp="2026-04-16 19:31:27 +0000 UTC" firstStartedPulling="2026-04-16 19:31:29.152358468 +0000 UTC m=+67.687372897" lastFinishedPulling="2026-04-16 19:31:32.188677183 +0000 UTC m=+70.723691614" observedRunningTime="2026-04-16 19:31:32.312318837 +0000 UTC m=+70.847333284" watchObservedRunningTime="2026-04-16 19:31:32.313833616 +0000 UTC m=+70.848848064" Apr 16 19:31:32.328280 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:32.328243 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-brgng" podStartSLOduration=67.531735489 podStartE2EDuration="1m10.328227833s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:31:28.460286624 +0000 UTC m=+66.995301050" lastFinishedPulling="2026-04-16 19:31:31.256778955 +0000 UTC m=+69.791793394" observedRunningTime="2026-04-16 19:31:32.327595698 +0000 UTC m=+70.862610147" watchObservedRunningTime="2026-04-16 19:31:32.328227833 +0000 UTC m=+70.863242282" Apr 16 19:31:34.991183 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:34.991149 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7jf9f"] Apr 16 19:31:35.028028 ip-10-0-133-241 
kubenswrapper[2575]: I0416 19:31:35.028001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.031101 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.030661 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:31:35.031785 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.031769 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:31:35.032289 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.032275 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:31:35.032778 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.032760 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:31:35.033619 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.033466 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qtsms\"" Apr 16 19:31:35.033619 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.033565 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:31:35.033813 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.033756 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:31:35.132229 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-root\") pod \"node-exporter-7jf9f\" 
(UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132441 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-textfile\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132555 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-metrics-client-ca\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132713 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132834 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqx5w\" (UniqueName: \"kubernetes.io/projected/05235b17-5e57-4ce3-b9d2-70856ad228d0-kube-api-access-xqx5w\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132834 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132824 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132952 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-sys\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.132952 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-tls\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.133039 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.132949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-wtmp\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.233516 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.233484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-metrics-client-ca\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " 
pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.233772 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.233751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.233933 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.233911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqx5w\" (UniqueName: \"kubernetes.io/projected/05235b17-5e57-4ce3-b9d2-70856ad228d0-kube-api-access-xqx5w\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234071 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.233952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234071 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.233984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-sys\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234071 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-tls\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-wtmp\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-metrics-client-ca\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-root\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-textfile\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234244 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-root\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234505 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-wtmp\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234505 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234505 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05235b17-5e57-4ce3-b9d2-70856ad228d0-sys\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.234505 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.234445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-textfile\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.236736 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.236700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-tls\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.236909 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.236890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05235b17-5e57-4ce3-b9d2-70856ad228d0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.246877 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.246822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqx5w\" (UniqueName: \"kubernetes.io/projected/05235b17-5e57-4ce3-b9d2-70856ad228d0-kube-api-access-xqx5w\") pod \"node-exporter-7jf9f\" (UID: \"05235b17-5e57-4ce3-b9d2-70856ad228d0\") " pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.341026 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:35.340995 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7jf9f" Apr 16 19:31:35.349406 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:35.349364 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05235b17_5e57_4ce3_b9d2_70856ad228d0.slice/crio-872d9be5035f9a16e4748aaf45e1b2081003e5918d7899cdfbd0b7bbc2d1829c WatchSource:0}: Error finding container 872d9be5035f9a16e4748aaf45e1b2081003e5918d7899cdfbd0b7bbc2d1829c: Status 404 returned error can't find the container with id 872d9be5035f9a16e4748aaf45e1b2081003e5918d7899cdfbd0b7bbc2d1829c Apr 16 19:31:36.086586 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.084267 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:31:36.093420 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.092901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.096210 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096004 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:31:36.096210 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:31:36.096368 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096262 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:31:36.096491 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:31:36.096594 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096515 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mkswb\"" Apr 16 19:31:36.096594 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:31:36.096716 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:31:36.096786 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:31:36.096992 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.096963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:31:36.097326 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.097306 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:31:36.114954 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.114508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141432 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlr7z\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141739 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.142540 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.141796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242362 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242324 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
19:31:36.242528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242687 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242687 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242687 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out\") pod \"alertmanager-main-0\" (UID: 
\"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242687 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242687 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.242923 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.242823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlr7z\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.243307 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.243181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.243307 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:31:36.243294 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle podName:3db10b2f-00c7-4d42-b997-83452eae36da nodeName:}" failed. No retries permitted until 2026-04-16 19:31:36.743274204 +0000 UTC m=+75.278288632 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da") : configmap references non-existent config key: ca-bundle.crt Apr 16 19:31:36.243724 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.243670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.245603 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.245578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.246219 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.246194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247159 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247291 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247268 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247291 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247694 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247694 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247815 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.247881 
ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.247854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.256816 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.256784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlr7z\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.305459 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.305423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jf9f" event={"ID":"05235b17-5e57-4ce3-b9d2-70856ad228d0","Type":"ContainerStarted","Data":"872d9be5035f9a16e4748aaf45e1b2081003e5918d7899cdfbd0b7bbc2d1829c"} Apr 16 19:31:36.747673 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.747597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:36.748397 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:36.748373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:37.005622 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:37.005550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:31:37.151840 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:37.151803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:31:37.154685 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:37.154641 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db10b2f_00c7_4d42_b997_83452eae36da.slice/crio-fa3791aeab961b5ee95f8c5bab9459aeefb330c7309326efdb396eb2bb43b900 WatchSource:0}: Error finding container fa3791aeab961b5ee95f8c5bab9459aeefb330c7309326efdb396eb2bb43b900: Status 404 returned error can't find the container with id fa3791aeab961b5ee95f8c5bab9459aeefb330c7309326efdb396eb2bb43b900 Apr 16 19:31:37.309592 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:37.309464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"fa3791aeab961b5ee95f8c5bab9459aeefb330c7309326efdb396eb2bb43b900"} Apr 16 19:31:37.311132 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:37.311104 2575 generic.go:358] "Generic (PLEG): container finished" podID="05235b17-5e57-4ce3-b9d2-70856ad228d0" containerID="830fa7aaa2d6bea5c62a1a8491d1d3f7ec23ec43a037f1c2c00fa7189429d731" exitCode=0 Apr 16 19:31:37.311261 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:37.311187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jf9f" event={"ID":"05235b17-5e57-4ce3-b9d2-70856ad228d0","Type":"ContainerDied","Data":"830fa7aaa2d6bea5c62a1a8491d1d3f7ec23ec43a037f1c2c00fa7189429d731"} Apr 16 19:31:38.315411 ip-10-0-133-241 kubenswrapper[2575]: 
I0416 19:31:38.315377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jf9f" event={"ID":"05235b17-5e57-4ce3-b9d2-70856ad228d0","Type":"ContainerStarted","Data":"7cee9600aa32cbb3743870524615a6b362651f1b73a438d735889d5339ebdb34"} Apr 16 19:31:38.315411 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:38.315415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jf9f" event={"ID":"05235b17-5e57-4ce3-b9d2-70856ad228d0","Type":"ContainerStarted","Data":"afd4184dbd00c6d55b97766677d98907673732d817f2846ea97db0978bcca412"} Apr 16 19:31:38.337394 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:38.337337 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7jf9f" podStartSLOduration=3.284582018 podStartE2EDuration="4.337324154s" podCreationTimestamp="2026-04-16 19:31:34 +0000 UTC" firstStartedPulling="2026-04-16 19:31:35.351470486 +0000 UTC m=+73.886484933" lastFinishedPulling="2026-04-16 19:31:36.404212637 +0000 UTC m=+74.939227069" observedRunningTime="2026-04-16 19:31:38.335273031 +0000 UTC m=+76.870287480" watchObservedRunningTime="2026-04-16 19:31:38.337324154 +0000 UTC m=+76.872338602" Apr 16 19:31:39.319179 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.319141 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="eb6dfa6f30213c718e2bad2ce76eae4574d5ddb530288b90eb7c6b57c68a0b3c" exitCode=0 Apr 16 19:31:39.319731 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.319228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"eb6dfa6f30213c718e2bad2ce76eae4574d5ddb530288b90eb7c6b57c68a0b3c"} Apr 16 19:31:39.325802 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.325780 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/metrics-server-7f49bcb978-2xkc2"]
Apr 16 19:31:39.328940 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.328919 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.331594 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331577 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 19:31:39.331716 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 19:31:39.331716 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331705 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 19:31:39.331848 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:31:39.331920 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331824 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-45l9n7at9cedb\""
Apr 16 19:31:39.331920 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.331877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-9ct4h\""
Apr 16 19:31:39.339494 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.339475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f49bcb978-2xkc2"]
Apr 16 19:31:39.367408 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.367375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.367726 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.367623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-metrics-server-audit-profiles\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.368491 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.368177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-client-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.369867 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.368883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-client-certs\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.369867 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.368926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-tls\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.369867 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.368970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffqwz\" (UniqueName: \"kubernetes.io/projected/b207730a-c106-4c00-a949-9facc93e95b6-kube-api-access-ffqwz\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.369867 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.369142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b207730a-c106-4c00-a949-9facc93e95b6-audit-log\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471098 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-metrics-server-audit-profiles\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-client-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-client-certs\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-tls\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffqwz\" (UniqueName: \"kubernetes.io/projected/b207730a-c106-4c00-a949-9facc93e95b6-kube-api-access-ffqwz\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471253 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b207730a-c106-4c00-a949-9facc93e95b6-audit-log\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471700 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.471858 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.471833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b207730a-c106-4c00-a949-9facc93e95b6-audit-log\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.472242 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.472215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-metrics-server-audit-profiles\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.472443 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.472424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b207730a-c106-4c00-a949-9facc93e95b6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.473794 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.473771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-tls\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.473875 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.473793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-secret-metrics-server-client-certs\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.473875 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.473821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b207730a-c106-4c00-a949-9facc93e95b6-client-ca-bundle\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.483765 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.483743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffqwz\" (UniqueName: \"kubernetes.io/projected/b207730a-c106-4c00-a949-9facc93e95b6-kube-api-access-ffqwz\") pod \"metrics-server-7f49bcb978-2xkc2\" (UID: \"b207730a-c106-4c00-a949-9facc93e95b6\") " pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.638386 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.638363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:39.757853 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:39.757657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f49bcb978-2xkc2"]
Apr 16 19:31:39.760619 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:31:39.760590 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb207730a_c106_4c00_a949_9facc93e95b6.slice/crio-f7a6081e31b2134deb88e9b5773f7a89c43b8923b65e3449ee113d116155d446 WatchSource:0}: Error finding container f7a6081e31b2134deb88e9b5773f7a89c43b8923b65e3449ee113d116155d446: Status 404 returned error can't find the container with id f7a6081e31b2134deb88e9b5773f7a89c43b8923b65e3449ee113d116155d446
Apr 16 19:31:40.323656 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:40.323517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2" event={"ID":"b207730a-c106-4c00-a949-9facc93e95b6","Type":"ContainerStarted","Data":"f7a6081e31b2134deb88e9b5773f7a89c43b8923b65e3449ee113d116155d446"}
Apr 16 19:31:42.332981 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.332942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"20924f2f1b8d33990b29e2554a63ce445d7814596cf9499aa96a07e5d73b0179"}
Apr 16 19:31:42.332981 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.332986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"b568201c506f722703d0e75fe5d00978999d441d613dcb08f1959bc8a885cfd7"}
Apr 16 19:31:42.333401 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.332997 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"9c84f35041b1178772f2316c18250d9447a49490505ba942c2d65be49e8c2428"}
Apr 16 19:31:42.333401 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.333006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"d213bf2b6e3c315456bc245a018ca6d780b283b140e5fa8ce3367eafecf53b41"}
Apr 16 19:31:42.333401 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.333018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"905c93cd7c7616c2e8bde61769a8f54f2c2307673782d36c9e0e88078b4a35ce"}
Apr 16 19:31:42.334190 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.334166 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2" event={"ID":"b207730a-c106-4c00-a949-9facc93e95b6","Type":"ContainerStarted","Data":"e4d0f04b579af194a046b8b46afbe0bc882e11e989f650642c0c539b4899e868"}
Apr 16 19:31:42.353100 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:42.352998 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2" podStartSLOduration=1.7381366329999999 podStartE2EDuration="3.352979464s" podCreationTimestamp="2026-04-16 19:31:39 +0000 UTC" firstStartedPulling="2026-04-16 19:31:39.762625965 +0000 UTC m=+78.297640391" lastFinishedPulling="2026-04-16 19:31:41.377468795 +0000 UTC m=+79.912483222" observedRunningTime="2026-04-16 19:31:42.351496425 +0000 UTC m=+80.886510873" watchObservedRunningTime="2026-04-16 19:31:42.352979464 +0000 UTC m=+80.887993914"
Apr 16 19:31:43.339713 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:43.339675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerStarted","Data":"5984ff99320c483b8791c57e3181db58d5ec846a622e2c40fce0c926c61c8d39"}
Apr 16 19:31:43.376915 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:43.376871 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.180110956 podStartE2EDuration="7.376856767s" podCreationTimestamp="2026-04-16 19:31:36 +0000 UTC" firstStartedPulling="2026-04-16 19:31:37.156526126 +0000 UTC m=+75.691540553" lastFinishedPulling="2026-04-16 19:31:42.353271928 +0000 UTC m=+80.888286364" observedRunningTime="2026-04-16 19:31:43.372925158 +0000 UTC m=+81.907939606" watchObservedRunningTime="2026-04-16 19:31:43.376856767 +0000 UTC m=+81.911871214"
Apr 16 19:31:59.638987 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:59.638956 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:31:59.639390 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:31:59.639035 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:32:03.299728 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:03.299696 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-brgng"
Apr 16 19:32:19.643939 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:19.643905 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:32:19.647850 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:19.647828 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f49bcb978-2xkc2"
Apr 16 19:32:55.322408 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.322364 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:32:55.322993 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.322945 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="alertmanager" containerID="cri-o://905c93cd7c7616c2e8bde61769a8f54f2c2307673782d36c9e0e88078b4a35ce" gracePeriod=120
Apr 16 19:32:55.323084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.322977 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-metric" containerID="cri-o://20924f2f1b8d33990b29e2554a63ce445d7814596cf9499aa96a07e5d73b0179" gracePeriod=120
Apr 16 19:32:55.323147 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.323034 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-web" containerID="cri-o://9c84f35041b1178772f2316c18250d9447a49490505ba942c2d65be49e8c2428" gracePeriod=120
Apr 16 19:32:55.323147 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.323113 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="prom-label-proxy" containerID="cri-o://5984ff99320c483b8791c57e3181db58d5ec846a622e2c40fce0c926c61c8d39" gracePeriod=120
Apr 16 19:32:55.323147 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.323097 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy" containerID="cri-o://b568201c506f722703d0e75fe5d00978999d441d613dcb08f1959bc8a885cfd7" gracePeriod=120
Apr 16 19:32:55.323413 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.323356 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="config-reloader" containerID="cri-o://d213bf2b6e3c315456bc245a018ca6d780b283b140e5fa8ce3367eafecf53b41" gracePeriod=120
Apr 16 19:32:55.530144 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530116 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="5984ff99320c483b8791c57e3181db58d5ec846a622e2c40fce0c926c61c8d39" exitCode=0
Apr 16 19:32:55.530144 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530140 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="b568201c506f722703d0e75fe5d00978999d441d613dcb08f1959bc8a885cfd7" exitCode=0
Apr 16 19:32:55.530144 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530147 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="d213bf2b6e3c315456bc245a018ca6d780b283b140e5fa8ce3367eafecf53b41" exitCode=0
Apr 16 19:32:55.530341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530155 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="905c93cd7c7616c2e8bde61769a8f54f2c2307673782d36c9e0e88078b4a35ce" exitCode=0
Apr 16 19:32:55.530341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"5984ff99320c483b8791c57e3181db58d5ec846a622e2c40fce0c926c61c8d39"}
Apr 16 19:32:55.530341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"b568201c506f722703d0e75fe5d00978999d441d613dcb08f1959bc8a885cfd7"}
Apr 16 19:32:55.530341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"d213bf2b6e3c315456bc245a018ca6d780b283b140e5fa8ce3367eafecf53b41"}
Apr 16 19:32:55.530341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:55.530234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"905c93cd7c7616c2e8bde61769a8f54f2c2307673782d36c9e0e88078b4a35ce"}
Apr 16 19:32:56.536629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.536593 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="20924f2f1b8d33990b29e2554a63ce445d7814596cf9499aa96a07e5d73b0179" exitCode=0
Apr 16 19:32:56.536629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.536625 2575 generic.go:358] "Generic (PLEG): container finished" podID="3db10b2f-00c7-4d42-b997-83452eae36da" containerID="9c84f35041b1178772f2316c18250d9447a49490505ba942c2d65be49e8c2428" exitCode=0
Apr 16 19:32:56.536976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.536620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"20924f2f1b8d33990b29e2554a63ce445d7814596cf9499aa96a07e5d73b0179"}
Apr 16 19:32:56.536976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.536664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"9c84f35041b1178772f2316c18250d9447a49490505ba942c2d65be49e8c2428"}
Apr 16 19:32:56.551601 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.551581 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:32:56.591943 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.591911 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592080 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.591970 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592080 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592080 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592036 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592113 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592156 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592185 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592220 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlr7z\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592259 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592287 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592310 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592341 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592341 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.592803 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.592771 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:32:56.593268 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.593242 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:32:56.594111 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.594083 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:32:56.594766 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.594728 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.595437 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.595401 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.595835 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.595805 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:32:56.596872 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.596800 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume" (OuterVolumeSpecName: "config-volume") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.596967 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.596892 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.597415 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.597385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out" (OuterVolumeSpecName: "config-out") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:32:56.597528 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.597446 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.598200 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.598176 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z" (OuterVolumeSpecName: "kube-api-access-dlr7z") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "kube-api-access-dlr7z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:32:56.599618 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.599600 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.692747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692685 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config\") pod \"3db10b2f-00c7-4d42-b997-83452eae36da\" (UID: \"3db10b2f-00c7-4d42-b997-83452eae36da\") "
Apr 16 19:32:56.692871 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692856 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.692932 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692878 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-config-out\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.692932 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692893 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-main-db\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.692932 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692907 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-main-tls\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.692932 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692922 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-cluster-tls-config\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692938 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692953 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-tls-assets\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692966 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692980 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3db10b2f-00c7-4d42-b997-83452eae36da-metrics-client-ca\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.692994 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.693008 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-config-volume\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.693139 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.693022 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dlr7z\" (UniqueName: \"kubernetes.io/projected/3db10b2f-00c7-4d42-b997-83452eae36da-kube-api-access-dlr7z\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:56.702361 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.702334 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config" (OuterVolumeSpecName: "web-config") pod "3db10b2f-00c7-4d42-b997-83452eae36da" (UID: "3db10b2f-00c7-4d42-b997-83452eae36da"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:56.793270 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:56.793247 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3db10b2f-00c7-4d42-b997-83452eae36da-web-config\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:32:57.541858 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.541822 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3db10b2f-00c7-4d42-b997-83452eae36da","Type":"ContainerDied","Data":"fa3791aeab961b5ee95f8c5bab9459aeefb330c7309326efdb396eb2bb43b900"}
Apr 16 19:32:57.541858 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.541862 2575 scope.go:117] "RemoveContainer" containerID="5984ff99320c483b8791c57e3181db58d5ec846a622e2c40fce0c926c61c8d39"
Apr 16 19:32:57.542283 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.541892 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.549534 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.549471 2575 scope.go:117] "RemoveContainer" containerID="20924f2f1b8d33990b29e2554a63ce445d7814596cf9499aa96a07e5d73b0179" Apr 16 19:32:57.555753 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.555734 2575 scope.go:117] "RemoveContainer" containerID="b568201c506f722703d0e75fe5d00978999d441d613dcb08f1959bc8a885cfd7" Apr 16 19:32:57.561513 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.561499 2575 scope.go:117] "RemoveContainer" containerID="9c84f35041b1178772f2316c18250d9447a49490505ba942c2d65be49e8c2428" Apr 16 19:32:57.564800 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.564779 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:32:57.567881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.567866 2575 scope.go:117] "RemoveContainer" containerID="d213bf2b6e3c315456bc245a018ca6d780b283b140e5fa8ce3367eafecf53b41" Apr 16 19:32:57.569094 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.569070 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:32:57.574114 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.574098 2575 scope.go:117] "RemoveContainer" containerID="905c93cd7c7616c2e8bde61769a8f54f2c2307673782d36c9e0e88078b4a35ce" Apr 16 19:32:57.580005 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.579990 2575 scope.go:117] "RemoveContainer" containerID="eb6dfa6f30213c718e2bad2ce76eae4574d5ddb530288b90eb7c6b57c68a0b3c" Apr 16 19:32:57.597415 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597395 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:32:57.597705 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597685 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="init-config-reloader" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597708 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="init-config-reloader" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597720 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-web" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597728 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-web" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597736 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="prom-label-proxy" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597744 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="prom-label-proxy" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597760 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-metric" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597768 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-metric" Apr 16 19:32:57.597782 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597779 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="alertmanager" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597788 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="alertmanager" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597798 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597805 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597819 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="config-reloader" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597827 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="config-reloader" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597893 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-web" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597906 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="alertmanager" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597916 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597925 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="kube-rbac-proxy-metric" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597935 
2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="prom-label-proxy" Apr 16 19:32:57.598186 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.597944 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" containerName="config-reloader" Apr 16 19:32:57.602767 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.602749 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.605184 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605164 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mkswb\"" Apr 16 19:32:57.605278 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:32:57.605278 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605172 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:32:57.605391 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605188 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:32:57.605391 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:32:57.605683 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:32:57.605683 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605678 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:32:57.605877 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605679 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:32:57.605877 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.605702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:32:57.610789 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.610768 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:32:57.615728 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.615708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:32:57.699096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699096 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699083 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-web-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699269 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699465 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699465 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6drs\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-kube-api-access-m6drs\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699465 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-config-out\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.699465 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.699354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800234 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6drs\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-kube-api-access-m6drs\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-config-out\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800277 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800325 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800503 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800503 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800503 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800503 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800503 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-web-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.800747 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.800645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.801557 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.801219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.801557 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.801495 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0a646be-775c-46cf-a28a-25936d67d2a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.803388 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.803358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.803490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.803387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.803490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.803418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.803490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.803419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.803811 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.803790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f0a646be-775c-46cf-a28a-25936d67d2a3-config-out\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.804117 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.804099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.804201 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.804184 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.804681 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.804658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-web-config\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.805188 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.805173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f0a646be-775c-46cf-a28a-25936d67d2a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.809885 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.809864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6drs\" (UniqueName: \"kubernetes.io/projected/f0a646be-775c-46cf-a28a-25936d67d2a3-kube-api-access-m6drs\") pod \"alertmanager-main-0\" (UID: \"f0a646be-775c-46cf-a28a-25936d67d2a3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:32:57.912197 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:57.912170 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:32:58.019304 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:58.019276 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db10b2f-00c7-4d42-b997-83452eae36da" path="/var/lib/kubelet/pods/3db10b2f-00c7-4d42-b997-83452eae36da/volumes"
Apr 16 19:32:58.051274 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:58.051232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:32:58.054435 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:32:58.054411 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0a646be_775c_46cf_a28a_25936d67d2a3.slice/crio-3d1eeba9324a1bae0b9dd9bef5f34075e2449f3e33de4e87cb3a9b05bab0ea99 WatchSource:0}: Error finding container 3d1eeba9324a1bae0b9dd9bef5f34075e2449f3e33de4e87cb3a9b05bab0ea99: Status 404 returned error can't find the container with id 3d1eeba9324a1bae0b9dd9bef5f34075e2449f3e33de4e87cb3a9b05bab0ea99
Apr 16 19:32:58.550629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:58.550594 2575 generic.go:358] "Generic (PLEG): container finished" podID="f0a646be-775c-46cf-a28a-25936d67d2a3" containerID="729027c2900e93a2bcbe8e8c0ceb2189697c0bcecdae8b2e39ec4d9564f82b6d" exitCode=0
Apr 16 19:32:58.551099 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:58.550649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerDied","Data":"729027c2900e93a2bcbe8e8c0ceb2189697c0bcecdae8b2e39ec4d9564f82b6d"}
Apr 16 19:32:58.551099 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:58.550675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"3d1eeba9324a1bae0b9dd9bef5f34075e2449f3e33de4e87cb3a9b05bab0ea99"}
Apr 16 19:32:59.363842 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.363799 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"]
Apr 16 19:32:59.366103 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.366089 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.368733 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.368710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 19:32:59.368881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.368743 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 19:32:59.368881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.368779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-b4f4p\""
Apr 16 19:32:59.368881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.368712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 19:32:59.368881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.368710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 19:32:59.369037 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.369025 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 19:32:59.374607 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.374590 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 19:32:59.381506 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.381487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"]
Apr 16 19:32:59.411759 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-serving-certs-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411849 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-federate-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411849 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-metrics-client-ca\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411849 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8472v\" (UniqueName: \"kubernetes.io/projected/059796ae-ded3-4d75-a36b-9d22829219b5-kube-api-access-8472v\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.411949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.411936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.512884 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.512863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.512951 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.512889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.512951 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.512913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.512951 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.512940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8472v\" (UniqueName: \"kubernetes.io/projected/059796ae-ded3-4d75-a36b-9d22829219b5-kube-api-access-8472v\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513082 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.512971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513082 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-serving-certs-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513164 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-federate-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513164 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-metrics-client-ca\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513820 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-serving-certs-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513920 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-metrics-client-ca\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.513988 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.513936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-trusted-ca-bundle\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.515469 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.515445 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-telemeter-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.515554 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.515493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-federate-client-tls\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.515846 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.515824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.515883 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.515872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/059796ae-ded3-4d75-a36b-9d22829219b5-secret-telemeter-client\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.521355 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.521338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8472v\" (UniqueName: \"kubernetes.io/projected/059796ae-ded3-4d75-a36b-9d22829219b5-kube-api-access-8472v\") pod \"telemeter-client-585d4fc4cf-m6m44\" (UID: \"059796ae-ded3-4d75-a36b-9d22829219b5\") " pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.555949 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.555923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"dbde2c47cd0888f24b25cc324273e23fa8f4170858d8d2748c2779178e9144ab"}
Apr 16 19:32:59.556285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.555952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"59e08bc86ce44226b30073d2cf0be380928dab5f4aceb222e5a380f0b906996b"}
Apr 16 19:32:59.556285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.555966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"16511f9abe7a12a7c8e5ee0c5c888e2b36dcff97ac2d9588b1bf02e4d1fa1a1a"}
Apr 16 19:32:59.556285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.555978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"06c715ba3fca6af7b4798bb3e3f6926726db61e6e2c37251a9f99a1715822b80"}
Apr 16 19:32:59.556285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.555990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"2de97d9bb487df8bfcc8e51c5944b3a1b001eb2ab52609aa2de3ef779479634e"}
Apr 16 19:32:59.556285 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.556002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f0a646be-775c-46cf-a28a-25936d67d2a3","Type":"ContainerStarted","Data":"f08912c4d22d72253fb289cb68207c486c14fc41e3957a1215d6406052fdceb4"}
Apr 16 19:32:59.582751 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.582703 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.582691527 podStartE2EDuration="2.582691527s" podCreationTimestamp="2026-04-16 19:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:32:59.580700212 +0000 UTC m=+158.115714683" watchObservedRunningTime="2026-04-16 19:32:59.582691527 +0000 UTC m=+158.117705974"
Apr 16 19:32:59.675177 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.675156 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"
Apr 16 19:32:59.798028 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:32:59.797999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-585d4fc4cf-m6m44"]
Apr 16 19:32:59.800718 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:32:59.800689 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059796ae_ded3_4d75_a36b_9d22829219b5.slice/crio-614f516f3cca88111dd7d48b26d328a31350c87739ae5ab40507f6fe56b33766 WatchSource:0}: Error finding container 614f516f3cca88111dd7d48b26d328a31350c87739ae5ab40507f6fe56b33766: Status 404 returned error can't find the container with id 614f516f3cca88111dd7d48b26d328a31350c87739ae5ab40507f6fe56b33766
Apr 16 19:33:00.567855 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:33:00.567804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44" event={"ID":"059796ae-ded3-4d75-a36b-9d22829219b5","Type":"ContainerStarted","Data":"614f516f3cca88111dd7d48b26d328a31350c87739ae5ab40507f6fe56b33766"}
Apr 16 19:33:02.575646 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:33:02.575614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44" event={"ID":"059796ae-ded3-4d75-a36b-9d22829219b5","Type":"ContainerStarted","Data":"0381d1336e9c1ed33cae8f0f63b9b9f9c076b458b816b9d5e07fba52f5756d58"}
Apr 16 19:33:02.576029 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:33:02.575652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44" event={"ID":"059796ae-ded3-4d75-a36b-9d22829219b5","Type":"ContainerStarted","Data":"8244eaf99ef1fa39137d9a9449a030c83a0b78ffcacc23d347ba2f93a3c31bed"}
Apr 16 19:33:02.576029 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:33:02.575667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44" event={"ID":"059796ae-ded3-4d75-a36b-9d22829219b5","Type":"ContainerStarted","Data":"ed866a4a993a59cfdae2dd4d9d35d2747c8dbf9b7275eba9ec998d2fcf636f80"}
Apr 16 19:33:02.598205 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:33:02.598158 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-585d4fc4cf-m6m44" podStartSLOduration=1.45142106 podStartE2EDuration="3.598146009s" podCreationTimestamp="2026-04-16 19:32:59 +0000 UTC" firstStartedPulling="2026-04-16 19:32:59.802471336 +0000 UTC m=+158.337485765" lastFinishedPulling="2026-04-16 19:33:01.949196288 +0000 UTC m=+160.484210714" observedRunningTime="2026-04-16 19:33:02.596482082 +0000 UTC m=+161.131496541" watchObservedRunningTime="2026-04-16 19:33:02.598146009 +0000 UTC m=+161.133160457"
Apr 16 19:34:38.490100 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.490067 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7bgmg"]
Apr 16 19:34:38.493176 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.493160 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.496635 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.496616 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:34:38.505154 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.503216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7bgmg"]
Apr 16 19:34:38.520499 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.520475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-dbus\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.520587 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.520524 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75450654-f95b-4a01-aacb-c02e80738893-original-pull-secret\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.520587 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.520552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-kubelet-config\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.621779 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.621752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-dbus\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.621860 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.621799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75450654-f95b-4a01-aacb-c02e80738893-original-pull-secret\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.621939 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.621920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-kubelet-config\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.621974 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.621945 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-dbus\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.622036 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.622017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75450654-f95b-4a01-aacb-c02e80738893-kubelet-config\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.623922 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.623906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75450654-f95b-4a01-aacb-c02e80738893-original-pull-secret\") pod \"global-pull-secret-syncer-7bgmg\" (UID: \"75450654-f95b-4a01-aacb-c02e80738893\") " pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.806545 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.806480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7bgmg"
Apr 16 19:34:38.921807 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:38.921779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7bgmg"]
Apr 16 19:34:38.925091 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:34:38.925037 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75450654_f95b_4a01_aacb_c02e80738893.slice/crio-a89a0bac4128553fc8f677118acc7c88322eceb2020a5eaa73d6223aabccc601 WatchSource:0}: Error finding container a89a0bac4128553fc8f677118acc7c88322eceb2020a5eaa73d6223aabccc601: Status 404 returned error can't find the container with id a89a0bac4128553fc8f677118acc7c88322eceb2020a5eaa73d6223aabccc601
Apr 16 19:34:39.823078 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:39.823024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7bgmg" event={"ID":"75450654-f95b-4a01-aacb-c02e80738893","Type":"ContainerStarted","Data":"a89a0bac4128553fc8f677118acc7c88322eceb2020a5eaa73d6223aabccc601"}
Apr 16 19:34:43.834487 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:43.834455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7bgmg" event={"ID":"75450654-f95b-4a01-aacb-c02e80738893","Type":"ContainerStarted","Data":"c647f5d5b59d168bb186b706a61d21b3cf56950c9a788fab0521a7c201ed7ab1"}
Apr 16 19:34:43.849262 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:34:43.849220 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7bgmg" podStartSLOduration=1.959155344 podStartE2EDuration="5.84920612s" podCreationTimestamp="2026-04-16 19:34:38 +0000 UTC" firstStartedPulling="2026-04-16 19:34:38.926439459 +0000 UTC m=+257.461453884" lastFinishedPulling="2026-04-16 19:34:42.816490231 +0000 UTC m=+261.351504660" observedRunningTime="2026-04-16 19:34:43.84822836 +0000 UTC m=+262.383242809" watchObservedRunningTime="2026-04-16 19:34:43.84920612 +0000 UTC m=+262.384220568"
Apr 16 19:35:21.906610 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:21.906580 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 19:35:30.687763 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.687730 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"]
Apr 16 19:35:30.689926 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.689911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.692739 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.692713 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 19:35:30.692870 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.692764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-8qh8p\""
Apr 16 19:35:30.693828 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.693811 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 19:35:30.700432 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.700408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"]
Apr 16 19:35:30.787194 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.787157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.787338 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.787210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc78v\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-kube-api-access-nc78v\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.887816 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.887788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.887890 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.887835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc78v\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-kube-api-access-nc78v\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.896789 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.896761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.896891 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.896848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc78v\" (UniqueName: \"kubernetes.io/projected/95e108fd-54c7-4360-956f-113a205de0aa-kube-api-access-nc78v\") pod \"cert-manager-cainjector-8966b78d4-8t5cc\" (UID: \"95e108fd-54c7-4360-956f-113a205de0aa\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:30.998637 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:30.998581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"
Apr 16 19:35:31.112148 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:31.112116 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-8t5cc"]
Apr 16 19:35:31.116419 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:35:31.116388 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95e108fd_54c7_4360_956f_113a205de0aa.slice/crio-40b16b23705e3ee5d5f45f6563cbbe94cb7cddaa1d8dc754401af01940dab4bd WatchSource:0}: Error finding container 40b16b23705e3ee5d5f45f6563cbbe94cb7cddaa1d8dc754401af01940dab4bd: Status 404 returned error can't find the container with id 40b16b23705e3ee5d5f45f6563cbbe94cb7cddaa1d8dc754401af01940dab4bd
Apr 16 19:35:31.118149 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:31.118131 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:35:31.957042 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:31.957008 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc" event={"ID":"95e108fd-54c7-4360-956f-113a205de0aa","Type":"ContainerStarted","Data":"40b16b23705e3ee5d5f45f6563cbbe94cb7cddaa1d8dc754401af01940dab4bd"}
Apr 16 19:35:34.972570 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:34.972531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc" event={"ID":"95e108fd-54c7-4360-956f-113a205de0aa","Type":"ContainerStarted","Data":"dad17cde82662f4c041c35659d8bae8650e05d541e815a0f7ef0047da1406b87"}
Apr 16 19:35:35.017668 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:35.017626 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-8t5cc" podStartSLOduration=1.7425961559999998 podStartE2EDuration="5.017615408s" podCreationTimestamp="2026-04-16 19:35:30 +0000 UTC" firstStartedPulling="2026-04-16 19:35:31.118261248 +0000 UTC m=+309.653275673" lastFinishedPulling="2026-04-16 19:35:34.393280499 +0000 UTC m=+312.928294925" observedRunningTime="2026-04-16 19:35:35.017382944 +0000 UTC m=+313.552397395" watchObservedRunningTime="2026-04-16 19:35:35.017615408 +0000 UTC m=+313.552629856"
Apr 16 19:35:47.122947 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.122917 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-wzqjg"]
Apr 16 19:35:47.124962 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.124945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.127631 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.127614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-b7rk7\""
Apr 16 19:35:47.150029 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.150008 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wzqjg"]
Apr 16 19:35:47.301138 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.301109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-bound-sa-token\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.301264 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.301148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhgz\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-kube-api-access-txhgz\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.401962 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.401938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-bound-sa-token\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.402082 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.401972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txhgz\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-kube-api-access-txhgz\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.409625 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.409604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-bound-sa-token\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.409768 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.409751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhgz\" (UniqueName: \"kubernetes.io/projected/b44579e2-4800-4cb2-89bb-816d05069669-kube-api-access-txhgz\") pod \"cert-manager-759f64656b-wzqjg\" (UID: \"b44579e2-4800-4cb2-89bb-816d05069669\") " pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.433727 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.433707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wzqjg"
Apr 16 19:35:47.551776 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:47.551749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wzqjg"]
Apr 16 19:35:47.554481 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:35:47.554454 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb44579e2_4800_4cb2_89bb_816d05069669.slice/crio-5742c519085fc41772027ec1948d6c8185de04e71c4411a4bf3091755707f75a WatchSource:0}: Error finding container 5742c519085fc41772027ec1948d6c8185de04e71c4411a4bf3091755707f75a: Status 404 returned error can't find the container with id 5742c519085fc41772027ec1948d6c8185de04e71c4411a4bf3091755707f75a
Apr 16 19:35:48.011246 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:48.011207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wzqjg" event={"ID":"b44579e2-4800-4cb2-89bb-816d05069669","Type":"ContainerStarted","Data":"5b9faa1a2392066d2e9b8cd1c0fee20f5cdfa100616448247e6636d98ff98f3b"}
Apr 16 19:35:48.011246 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:48.011247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wzqjg" event={"ID":"b44579e2-4800-4cb2-89bb-816d05069669","Type":"ContainerStarted","Data":"5742c519085fc41772027ec1948d6c8185de04e71c4411a4bf3091755707f75a"}
Apr 16 19:35:48.026446 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:35:48.026399 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-wzqjg" podStartSLOduration=1.026386009 podStartE2EDuration="1.026386009s" podCreationTimestamp="2026-04-16 19:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:35:48.024869886 +0000 UTC m=+326.559884333" watchObservedRunningTime="2026-04-16 19:35:48.026386009 +0000 UTC m=+326.561400456"
Apr 16 19:36:03.631361 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.631311 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j"]
Apr 16 19:36:03.633562 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.633542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j"
Apr 16 19:36:03.637488 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.637460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 19:36:03.637488 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.637477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 19:36:03.637680 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.637528 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vrbr4\""
Apr 16 19:36:03.637680 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.637474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 19:36:03.637680 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.637591 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 19:36:03.651135 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.651116 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j"]
Apr 16 19:36:03.717799 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.717774 2575 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txzp6\" (UniqueName: \"kubernetes.io/projected/a2b9be98-fc85-4722-bac5-71d18e928abc-kube-api-access-txzp6\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.717906 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.717812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.717906 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.717841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.818611 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.818588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.818716 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.818646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txzp6\" 
(UniqueName: \"kubernetes.io/projected/a2b9be98-fc85-4722-bac5-71d18e928abc-kube-api-access-txzp6\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.818716 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.818673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.821001 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.820983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.821116 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.821020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b9be98-fc85-4722-bac5-71d18e928abc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.826067 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.826037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txzp6\" (UniqueName: \"kubernetes.io/projected/a2b9be98-fc85-4722-bac5-71d18e928abc-kube-api-access-txzp6\") pod 
\"opendatahub-operator-controller-manager-57586b9555-rbs2j\" (UID: \"a2b9be98-fc85-4722-bac5-71d18e928abc\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:03.944291 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:03.944270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:04.066150 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:04.066014 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j"] Apr 16 19:36:04.068841 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:36:04.068812 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b9be98_fc85_4722_bac5_71d18e928abc.slice/crio-bdbaa0b67298f5e684f850b0561d3bf458b1eb930bbfb9f63a1861d4fb3fffe7 WatchSource:0}: Error finding container bdbaa0b67298f5e684f850b0561d3bf458b1eb930bbfb9f63a1861d4fb3fffe7: Status 404 returned error can't find the container with id bdbaa0b67298f5e684f850b0561d3bf458b1eb930bbfb9f63a1861d4fb3fffe7 Apr 16 19:36:05.059497 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:05.059453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" event={"ID":"a2b9be98-fc85-4722-bac5-71d18e928abc","Type":"ContainerStarted","Data":"bdbaa0b67298f5e684f850b0561d3bf458b1eb930bbfb9f63a1861d4fb3fffe7"} Apr 16 19:36:07.065944 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:07.065910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" event={"ID":"a2b9be98-fc85-4722-bac5-71d18e928abc","Type":"ContainerStarted","Data":"f6c95634038c98e408cb4fff7d92698dcd2d61f9c2faa802a32d1dbb1defc55f"} Apr 16 19:36:07.066323 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:07.066032 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:07.087790 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:07.087750 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" podStartSLOduration=1.618566812 podStartE2EDuration="4.087738477s" podCreationTimestamp="2026-04-16 19:36:03 +0000 UTC" firstStartedPulling="2026-04-16 19:36:04.070508252 +0000 UTC m=+342.605522685" lastFinishedPulling="2026-04-16 19:36:06.539679921 +0000 UTC m=+345.074694350" observedRunningTime="2026-04-16 19:36:07.085588883 +0000 UTC m=+345.620603344" watchObservedRunningTime="2026-04-16 19:36:07.087738477 +0000 UTC m=+345.622752983" Apr 16 19:36:13.149255 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.149170 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs"] Apr 16 19:36:13.152284 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.152263 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.155935 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.155900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 19:36:13.156084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.155903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 19:36:13.156084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.155904 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:36:13.156084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.155906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 19:36:13.156084 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.156040 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:36:13.156274 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.155913 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9rnfn\"" Apr 16 19:36:13.160649 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.160630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs"] Apr 16 19:36:13.188860 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.188840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d16d646f-ae3a-44fb-a958-2091814d236f-manager-config\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.188976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.188898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-metrics-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.188976 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.188946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.189104 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.189083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgpng\" (UniqueName: \"kubernetes.io/projected/d16d646f-ae3a-44fb-a958-2091814d236f-kube-api-access-fgpng\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.290094 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.290047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgpng\" (UniqueName: \"kubernetes.io/projected/d16d646f-ae3a-44fb-a958-2091814d236f-kube-api-access-fgpng\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.290219 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.290110 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d16d646f-ae3a-44fb-a958-2091814d236f-manager-config\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.290219 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.290145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-metrics-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.290219 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.290172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.290702 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.290676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d16d646f-ae3a-44fb-a958-2091814d236f-manager-config\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.292481 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.292457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-metrics-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.292610 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.292594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d16d646f-ae3a-44fb-a958-2091814d236f-cert\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.302458 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.302439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgpng\" (UniqueName: \"kubernetes.io/projected/d16d646f-ae3a-44fb-a958-2091814d236f-kube-api-access-fgpng\") pod \"lws-controller-manager-5bfdb756-4zkvs\" (UID: \"d16d646f-ae3a-44fb-a958-2091814d236f\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.461845 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.461758 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:13.580145 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:13.580119 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs"] Apr 16 19:36:13.582627 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:36:13.582601 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16d646f_ae3a_44fb_a958_2091814d236f.slice/crio-7e534d3854e03bba70d86b4f465fd533a235d78ab61ef2cbf7615175d3c6ac51 WatchSource:0}: Error finding container 7e534d3854e03bba70d86b4f465fd533a235d78ab61ef2cbf7615175d3c6ac51: Status 404 returned error can't find the container with id 7e534d3854e03bba70d86b4f465fd533a235d78ab61ef2cbf7615175d3c6ac51 Apr 16 19:36:14.085236 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:14.085196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" event={"ID":"d16d646f-ae3a-44fb-a958-2091814d236f","Type":"ContainerStarted","Data":"7e534d3854e03bba70d86b4f465fd533a235d78ab61ef2cbf7615175d3c6ac51"} Apr 16 19:36:17.094757 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:17.094716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" event={"ID":"d16d646f-ae3a-44fb-a958-2091814d236f","Type":"ContainerStarted","Data":"49ddeea3fbc663730b6d1d7537d9c424716a62fe32120d1e17f397e0bdf45d11"} Apr 16 19:36:17.095144 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:17.094942 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" Apr 16 19:36:17.117145 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:17.117100 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs" podStartSLOduration=1.5815870859999999 podStartE2EDuration="4.117087097s" podCreationTimestamp="2026-04-16 19:36:13 +0000 UTC" firstStartedPulling="2026-04-16 19:36:13.584264783 +0000 UTC m=+352.119279209" lastFinishedPulling="2026-04-16 19:36:16.119764794 +0000 UTC m=+354.654779220" observedRunningTime="2026-04-16 19:36:17.114534758 +0000 UTC m=+355.649549205" watchObservedRunningTime="2026-04-16 19:36:17.117087097 +0000 UTC m=+355.652101545" Apr 16 19:36:18.070899 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:18.070869 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-rbs2j" Apr 16 19:36:21.171548 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.171491 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2"] Apr 16 19:36:21.174859 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.174824 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.177302 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.177276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 19:36:21.179398 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.179373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:36:21.179491 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.179408 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:36:21.179491 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.179418 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 19:36:21.179491 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.179373 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lfkhf\"" Apr 16 19:36:21.188314 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.188292 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2"] Apr 16 19:36:21.257629 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.257603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzgdh\" (UniqueName: \"kubernetes.io/projected/2e2baf79-53c1-430e-921b-4af11527cc75-kube-api-access-pzgdh\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.257756 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.257634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2e2baf79-53c1-430e-921b-4af11527cc75-tmp\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.257756 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.257663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e2baf79-53c1-430e-921b-4af11527cc75-tls-certs\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.359108 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.359080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzgdh\" (UniqueName: \"kubernetes.io/projected/2e2baf79-53c1-430e-921b-4af11527cc75-kube-api-access-pzgdh\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.359108 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.359112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e2baf79-53c1-430e-921b-4af11527cc75-tmp\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.359287 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.359136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e2baf79-53c1-430e-921b-4af11527cc75-tls-certs\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.361510 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.361470 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e2baf79-53c1-430e-921b-4af11527cc75-tmp\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.361607 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.361583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e2baf79-53c1-430e-921b-4af11527cc75-tls-certs\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.369541 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.369523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzgdh\" (UniqueName: \"kubernetes.io/projected/2e2baf79-53c1-430e-921b-4af11527cc75-kube-api-access-pzgdh\") pod \"kube-auth-proxy-6b5579666b-tvzz2\" (UID: \"2e2baf79-53c1-430e-921b-4af11527cc75\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.485192 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.485119 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" Apr 16 19:36:21.620511 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:21.620479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2"] Apr 16 19:36:21.623949 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:36:21.623919 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e2baf79_53c1_430e_921b_4af11527cc75.slice/crio-2d7417cfb3b06f19df7dfc0a370cad03226eb4c460b3a6f07992d7e4687ec350 WatchSource:0}: Error finding container 2d7417cfb3b06f19df7dfc0a370cad03226eb4c460b3a6f07992d7e4687ec350: Status 404 returned error can't find the container with id 2d7417cfb3b06f19df7dfc0a370cad03226eb4c460b3a6f07992d7e4687ec350 Apr 16 19:36:22.111601 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:22.111568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" event={"ID":"2e2baf79-53c1-430e-921b-4af11527cc75","Type":"ContainerStarted","Data":"2d7417cfb3b06f19df7dfc0a370cad03226eb4c460b3a6f07992d7e4687ec350"} Apr 16 19:36:24.889045 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:24.889018 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 19:36:25.122617 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:25.122579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" event={"ID":"2e2baf79-53c1-430e-921b-4af11527cc75","Type":"ContainerStarted","Data":"8e7f3505a1d4b81487a750bd215c8a0c48905bdbc68cb4258a0aa92b1d9f215f"} Apr 16 19:36:25.155414 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:25.155361 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6b5579666b-tvzz2" podStartSLOduration=0.896992865 
podStartE2EDuration="4.155345559s" podCreationTimestamp="2026-04-16 19:36:21 +0000 UTC" firstStartedPulling="2026-04-16 19:36:21.625898435 +0000 UTC m=+360.160912862" lastFinishedPulling="2026-04-16 19:36:24.884251124 +0000 UTC m=+363.419265556" observedRunningTime="2026-04-16 19:36:25.154167096 +0000 UTC m=+363.689181547" watchObservedRunningTime="2026-04-16 19:36:25.155345559 +0000 UTC m=+363.690360042"
Apr 16 19:36:28.102851 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:36:28.102820 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-4zkvs"
Apr 16 19:38:13.971403 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.971365 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"]
Apr 16 19:38:13.973543 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.973526 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:13.975972 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.975945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 19:38:13.975972 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.975964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 19:38:13.976245 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.976229 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 19:38:13.976306 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.976262 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 19:38:13.976973 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.976953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5m27b\""
Apr 16 19:38:13.982299 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:13.982273 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"]
Apr 16 19:38:14.104069 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.104034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/50652469-c81f-48dd-9c1e-a61da64cb5ab-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.104231 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.104109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50652469-c81f-48dd-9c1e-a61da64cb5ab-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.104231 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.104149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzk67\" (UniqueName: \"kubernetes.io/projected/50652469-c81f-48dd-9c1e-a61da64cb5ab-kube-api-access-xzk67\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.205394 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.205364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50652469-c81f-48dd-9c1e-a61da64cb5ab-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.205531 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.205402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzk67\" (UniqueName: \"kubernetes.io/projected/50652469-c81f-48dd-9c1e-a61da64cb5ab-kube-api-access-xzk67\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.205531 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.205456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/50652469-c81f-48dd-9c1e-a61da64cb5ab-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.205944 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.205923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/50652469-c81f-48dd-9c1e-a61da64cb5ab-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.207765 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.207741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/50652469-c81f-48dd-9c1e-a61da64cb5ab-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.219835 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.217482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzk67\" (UniqueName: \"kubernetes.io/projected/50652469-c81f-48dd-9c1e-a61da64cb5ab-kube-api-access-xzk67\") pod \"kuadrant-console-plugin-6cb54b5c86-gvqs5\" (UID: \"50652469-c81f-48dd-9c1e-a61da64cb5ab\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.283171 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.283116 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"
Apr 16 19:38:14.411927 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.411894 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5"]
Apr 16 19:38:14.415203 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:38:14.415176 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50652469_c81f_48dd_9c1e_a61da64cb5ab.slice/crio-d8a8687ed223d13a96211cb5a27741ef1ba8022c09d980955150fed027669b26 WatchSource:0}: Error finding container d8a8687ed223d13a96211cb5a27741ef1ba8022c09d980955150fed027669b26: Status 404 returned error can't find the container with id d8a8687ed223d13a96211cb5a27741ef1ba8022c09d980955150fed027669b26
Apr 16 19:38:14.445412 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:14.445382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5" event={"ID":"50652469-c81f-48dd-9c1e-a61da64cb5ab","Type":"ContainerStarted","Data":"d8a8687ed223d13a96211cb5a27741ef1ba8022c09d980955150fed027669b26"}
Apr 16 19:38:38.540510 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:38.540468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5" event={"ID":"50652469-c81f-48dd-9c1e-a61da64cb5ab","Type":"ContainerStarted","Data":"70097b3979f589879cb2ce9a6e2709a12521354ece49f6ee6af4760ed1e2a4f0"}
Apr 16 19:38:38.558517 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:38:38.558471 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gvqs5" podStartSLOduration=2.175075698 podStartE2EDuration="25.558459389s" podCreationTimestamp="2026-04-16 19:38:13 +0000 UTC" firstStartedPulling="2026-04-16 19:38:14.41680982 +0000 UTC m=+472.951824246" lastFinishedPulling="2026-04-16 19:38:37.800193511 +0000 UTC m=+496.335207937" observedRunningTime="2026-04-16 19:38:38.556979544 +0000 UTC m=+497.091993992" watchObservedRunningTime="2026-04-16 19:38:38.558459389 +0000 UTC m=+497.093473839"
Apr 16 19:39:01.329147 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.329109 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:01.362606 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.362576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:01.362781 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.362693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:01.365272 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.365251 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-892hz\""
Apr 16 19:39:01.497398 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.497372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h8d2\" (UniqueName: \"kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2\") pod \"authorino-f99f4b5cd-8shkt\" (UID: \"c2f0df59-06cb-4bc3-b268-2ae96515611d\") " pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:01.598384 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.598313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h8d2\" (UniqueName: \"kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2\") pod \"authorino-f99f4b5cd-8shkt\" (UID: \"c2f0df59-06cb-4bc3-b268-2ae96515611d\") " pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:01.609658 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.609637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h8d2\" (UniqueName: \"kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2\") pod \"authorino-f99f4b5cd-8shkt\" (UID: \"c2f0df59-06cb-4bc3-b268-2ae96515611d\") " pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:01.670975 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.670942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:01.782482 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:01.782460 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:01.785068 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:39:01.785026 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f0df59_06cb_4bc3_b268_2ae96515611d.slice/crio-f021a6ae4a5f9890ebb58068af2616b52cd26fda6f4a6bd96c9c116b7d2942c4 WatchSource:0}: Error finding container f021a6ae4a5f9890ebb58068af2616b52cd26fda6f4a6bd96c9c116b7d2942c4: Status 404 returned error can't find the container with id f021a6ae4a5f9890ebb58068af2616b52cd26fda6f4a6bd96c9c116b7d2942c4
Apr 16 19:39:02.616580 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:02.616541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" event={"ID":"c2f0df59-06cb-4bc3-b268-2ae96515611d","Type":"ContainerStarted","Data":"f021a6ae4a5f9890ebb58068af2616b52cd26fda6f4a6bd96c9c116b7d2942c4"}
Apr 16 19:39:04.624518 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:04.624439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" event={"ID":"c2f0df59-06cb-4bc3-b268-2ae96515611d","Type":"ContainerStarted","Data":"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"}
Apr 16 19:39:04.640190 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:04.640147 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" podStartSLOduration=1.20143024 podStartE2EDuration="3.640133788s" podCreationTimestamp="2026-04-16 19:39:01 +0000 UTC" firstStartedPulling="2026-04-16 19:39:01.786272544 +0000 UTC m=+520.321286975" lastFinishedPulling="2026-04-16 19:39:04.224976098 +0000 UTC m=+522.759990523" observedRunningTime="2026-04-16 19:39:04.637672604 +0000 UTC m=+523.172687055" watchObservedRunningTime="2026-04-16 19:39:04.640133788 +0000 UTC m=+523.175148274"
Apr 16 19:39:06.508652 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:06.508611 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:06.630421 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:06.630355 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" podUID="c2f0df59-06cb-4bc3-b268-2ae96515611d" containerName="authorino" containerID="cri-o://756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7" gracePeriod=30
Apr 16 19:39:06.864379 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:06.864358 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:06.938627 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:06.938596 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h8d2\" (UniqueName: \"kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2\") pod \"c2f0df59-06cb-4bc3-b268-2ae96515611d\" (UID: \"c2f0df59-06cb-4bc3-b268-2ae96515611d\") "
Apr 16 19:39:06.940667 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:06.940641 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2" (OuterVolumeSpecName: "kube-api-access-2h8d2") pod "c2f0df59-06cb-4bc3-b268-2ae96515611d" (UID: "c2f0df59-06cb-4bc3-b268-2ae96515611d"). InnerVolumeSpecName "kube-api-access-2h8d2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:39:07.039891 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.039860 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h8d2\" (UniqueName: \"kubernetes.io/projected/c2f0df59-06cb-4bc3-b268-2ae96515611d-kube-api-access-2h8d2\") on node \"ip-10-0-133-241.ec2.internal\" DevicePath \"\""
Apr 16 19:39:07.634664 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.634618 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2f0df59-06cb-4bc3-b268-2ae96515611d" containerID="756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7" exitCode=0
Apr 16 19:39:07.634664 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.634668 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8shkt"
Apr 16 19:39:07.635143 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.634690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" event={"ID":"c2f0df59-06cb-4bc3-b268-2ae96515611d","Type":"ContainerDied","Data":"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"}
Apr 16 19:39:07.635143 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.634721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8shkt" event={"ID":"c2f0df59-06cb-4bc3-b268-2ae96515611d","Type":"ContainerDied","Data":"f021a6ae4a5f9890ebb58068af2616b52cd26fda6f4a6bd96c9c116b7d2942c4"}
Apr 16 19:39:07.635143 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.634740 2575 scope.go:117] "RemoveContainer" containerID="756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"
Apr 16 19:39:07.642698 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.642677 2575 scope.go:117] "RemoveContainer" containerID="756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"
Apr 16 19:39:07.642948 ip-10-0-133-241 kubenswrapper[2575]: E0416 19:39:07.642931 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7\": container with ID starting with 756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7 not found: ID does not exist" containerID="756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"
Apr 16 19:39:07.642990 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.642955 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7"} err="failed to get container status \"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7\": rpc error: code = NotFound desc = could not find container \"756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7\": container with ID starting with 756bd6f981b4d4edb2adda66909607c70dcc82ce7bd5316e9da6db1821086fe7 not found: ID does not exist"
Apr 16 19:39:07.658029 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.658006 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:07.659914 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:07.659894 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8shkt"]
Apr 16 19:39:08.018750 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:39:08.018722 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f0df59-06cb-4bc3-b268-2ae96515611d" path="/var/lib/kubelet/pods/c2f0df59-06cb-4bc3-b268-2ae96515611d/volumes"
Apr 16 19:40:20.432706 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.432672 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-86d84b9475-s65qk"]
Apr 16 19:40:20.433162 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.432969 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2f0df59-06cb-4bc3-b268-2ae96515611d" containerName="authorino"
Apr 16 19:40:20.433162 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.432980 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f0df59-06cb-4bc3-b268-2ae96515611d" containerName="authorino"
Apr 16 19:40:20.433162 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.433035 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2f0df59-06cb-4bc3-b268-2ae96515611d" containerName="authorino"
Apr 16 19:40:20.435806 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.435791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.439179 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.439154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 19:40:20.439179 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.439168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7fvpn\""
Apr 16 19:40:20.439179 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.439154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 19:40:20.445527 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.445507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86d84b9475-s65qk"]
Apr 16 19:40:20.554195 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.554165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzprw\" (UniqueName: \"kubernetes.io/projected/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-kube-api-access-mzprw\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.554346 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.554204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-maas-api-tls\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.655107 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.655076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzprw\" (UniqueName: \"kubernetes.io/projected/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-kube-api-access-mzprw\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.655257 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.655116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-maas-api-tls\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.657634 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.657601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-maas-api-tls\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.663396 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.663368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzprw\" (UniqueName: \"kubernetes.io/projected/1c67254d-f7ef-4df7-a36b-b57a51edc7f9-kube-api-access-mzprw\") pod \"maas-api-86d84b9475-s65qk\" (UID: \"1c67254d-f7ef-4df7-a36b-b57a51edc7f9\") " pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.746385 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.746321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:20.863636 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:20.863607 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86d84b9475-s65qk"]
Apr 16 19:40:20.866649 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:40:20.866620 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c67254d_f7ef_4df7_a36b_b57a51edc7f9.slice/crio-bc8739b81d343543dee51470cd390fc28427c257d83bd6143cbf66f1eceb8e0e WatchSource:0}: Error finding container bc8739b81d343543dee51470cd390fc28427c257d83bd6143cbf66f1eceb8e0e: Status 404 returned error can't find the container with id bc8739b81d343543dee51470cd390fc28427c257d83bd6143cbf66f1eceb8e0e
Apr 16 19:40:21.862648 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:21.862597 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86d84b9475-s65qk" event={"ID":"1c67254d-f7ef-4df7-a36b-b57a51edc7f9","Type":"ContainerStarted","Data":"bc8739b81d343543dee51470cd390fc28427c257d83bd6143cbf66f1eceb8e0e"}
Apr 16 19:40:23.040338 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:23.040316 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 19:40:23.870836 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:23.870797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86d84b9475-s65qk" event={"ID":"1c67254d-f7ef-4df7-a36b-b57a51edc7f9","Type":"ContainerStarted","Data":"4a8957ae1e0bf21dfd89a5127c215d061308830c3562cd8829a772205017dccb"}
Apr 16 19:40:23.871010 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:23.870924 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:40:23.889722 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:23.889672 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-86d84b9475-s65qk" podStartSLOduration=1.719842256 podStartE2EDuration="3.889655651s" podCreationTimestamp="2026-04-16 19:40:20 +0000 UTC" firstStartedPulling="2026-04-16 19:40:20.867804843 +0000 UTC m=+599.402819270" lastFinishedPulling="2026-04-16 19:40:23.037618234 +0000 UTC m=+601.572632665" observedRunningTime="2026-04-16 19:40:23.886565819 +0000 UTC m=+602.421580267" watchObservedRunningTime="2026-04-16 19:40:23.889655651 +0000 UTC m=+602.424670111"
Apr 16 19:40:29.879470 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:40:29.879443 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-86d84b9475-s65qk"
Apr 16 19:41:47.262566 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.262529 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-54749bfbbc-zgxb4"]
Apr 16 19:41:47.272250 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.272224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.274791 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.274765 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 19:41:47.275089 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.275067 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-892hz\""
Apr 16 19:41:47.275532 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.275514 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54749bfbbc-zgxb4"]
Apr 16 19:41:47.415327 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.415300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88f4eff8-1228-4bfa-ae67-22af8203f8f2-tls-cert\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.415436 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.415346 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcggf\" (UniqueName: \"kubernetes.io/projected/88f4eff8-1228-4bfa-ae67-22af8203f8f2-kube-api-access-fcggf\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.516309 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.516249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcggf\" (UniqueName: \"kubernetes.io/projected/88f4eff8-1228-4bfa-ae67-22af8203f8f2-kube-api-access-fcggf\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.516309 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.516306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88f4eff8-1228-4bfa-ae67-22af8203f8f2-tls-cert\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.518553 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.518529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88f4eff8-1228-4bfa-ae67-22af8203f8f2-tls-cert\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.524129 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.524103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcggf\" (UniqueName: \"kubernetes.io/projected/88f4eff8-1228-4bfa-ae67-22af8203f8f2-kube-api-access-fcggf\") pod \"authorino-54749bfbbc-zgxb4\" (UID: \"88f4eff8-1228-4bfa-ae67-22af8203f8f2\") " pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.582897 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.582868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54749bfbbc-zgxb4"
Apr 16 19:41:47.733959 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.733924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54749bfbbc-zgxb4"]
Apr 16 19:41:47.741424 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:41:47.741390 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f4eff8_1228_4bfa_ae67_22af8203f8f2.slice/crio-f77ff99260090144f33e7d98c02230ab1f5da35747019a255938171d64d9c8d5 WatchSource:0}: Error finding container f77ff99260090144f33e7d98c02230ab1f5da35747019a255938171d64d9c8d5: Status 404 returned error can't find the container with id f77ff99260090144f33e7d98c02230ab1f5da35747019a255938171d64d9c8d5
Apr 16 19:41:47.742797 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:47.742781 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:41:48.125785 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:48.125754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54749bfbbc-zgxb4" event={"ID":"88f4eff8-1228-4bfa-ae67-22af8203f8f2","Type":"ContainerStarted","Data":"f77ff99260090144f33e7d98c02230ab1f5da35747019a255938171d64d9c8d5"}
Apr 16 19:41:49.129999 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:49.129962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54749bfbbc-zgxb4" event={"ID":"88f4eff8-1228-4bfa-ae67-22af8203f8f2","Type":"ContainerStarted","Data":"7870df9943a93cd0d1f9f7373bd8b0f9c429db40be457c9741173eba60125edc"}
Apr 16 19:41:49.147579 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:41:49.147533 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-54749bfbbc-zgxb4" podStartSLOduration=1.739414193 podStartE2EDuration="2.147518238s" podCreationTimestamp="2026-04-16 19:41:47 +0000 UTC" firstStartedPulling="2026-04-16 19:41:47.742905529 +0000 UTC m=+686.277919955" lastFinishedPulling="2026-04-16 19:41:48.151009571 +0000 UTC m=+686.686024000" observedRunningTime="2026-04-16 19:41:49.146131666 +0000 UTC m=+687.681146115" watchObservedRunningTime="2026-04-16 19:41:49.147518238 +0000 UTC m=+687.682532686"
Apr 16 19:42:37.639042 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:37.638966 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54749bfbbc-zgxb4_88f4eff8-1228-4bfa-ae67-22af8203f8f2/authorino/0.log"
Apr 16 19:42:41.635490 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:41.635463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86d84b9475-s65qk_1c67254d-f7ef-4df7-a36b-b57a51edc7f9/maas-api/0.log"
Apr 16 19:42:42.120043 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:42.120017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-rbs2j_a2b9be98-fc85-4722-bac5-71d18e928abc/manager/0.log"
Apr 16 19:42:43.626413 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:43.626379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54749bfbbc-zgxb4_88f4eff8-1228-4bfa-ae67-22af8203f8f2/authorino/0.log"
Apr 16 19:42:43.968818 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:43.968792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gvqs5_50652469-c81f-48dd-9c1e-a61da64cb5ab/kuadrant-console-plugin/0.log"
Apr 16 19:42:45.030023 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:45.029992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6b5579666b-tvzz2_2e2baf79-53c1-430e-921b-4af11527cc75/kube-auth-proxy/0.log"
Apr 16 19:42:50.460617 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.460586 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5h99/must-gather-6j88h"]
Apr 16 19:42:50.463871 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.463855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.466486 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.466459 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"openshift-service-ca.crt\""
Apr 16 19:42:50.466694 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.466681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"kube-root-ca.crt\""
Apr 16 19:42:50.467727 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.467701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5h99\"/\"default-dockercfg-5clhq\""
Apr 16 19:42:50.472883 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.472862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/must-gather-6j88h"]
Apr 16 19:42:50.610822 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.610787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54dc9b29-e6ee-47e9-b405-3b727288c4ed-must-gather-output\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.610969 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.610854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bz5\" (UniqueName: \"kubernetes.io/projected/54dc9b29-e6ee-47e9-b405-3b727288c4ed-kube-api-access-x6bz5\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.712267 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.712196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54dc9b29-e6ee-47e9-b405-3b727288c4ed-must-gather-output\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.712267 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.712245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bz5\" (UniqueName: \"kubernetes.io/projected/54dc9b29-e6ee-47e9-b405-3b727288c4ed-kube-api-access-x6bz5\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.712588 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.712568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/54dc9b29-e6ee-47e9-b405-3b727288c4ed-must-gather-output\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.720033 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.720004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bz5\" (UniqueName: \"kubernetes.io/projected/54dc9b29-e6ee-47e9-b405-3b727288c4ed-kube-api-access-x6bz5\") pod \"must-gather-6j88h\" (UID: \"54dc9b29-e6ee-47e9-b405-3b727288c4ed\") " pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.773038 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.773015 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5h99/must-gather-6j88h"
Apr 16 19:42:50.892675 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:50.892524 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/must-gather-6j88h"]
Apr 16 19:42:50.895401 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:42:50.895373 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54dc9b29_e6ee_47e9_b405_3b727288c4ed.slice/crio-e4288f16b6758098abf85c4d840482b0303982a42dae78f6f090543ee9725a70 WatchSource:0}: Error finding container e4288f16b6758098abf85c4d840482b0303982a42dae78f6f090543ee9725a70: Status 404 returned error can't find the container with id e4288f16b6758098abf85c4d840482b0303982a42dae78f6f090543ee9725a70
Apr 16 19:42:51.327707 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:51.327672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/must-gather-6j88h" event={"ID":"54dc9b29-e6ee-47e9-b405-3b727288c4ed","Type":"ContainerStarted","Data":"e4288f16b6758098abf85c4d840482b0303982a42dae78f6f090543ee9725a70"}
Apr 16 19:42:52.333474 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:52.333440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/must-gather-6j88h" event={"ID":"54dc9b29-e6ee-47e9-b405-3b727288c4ed","Type":"ContainerStarted","Data":"67a6aacfb747efcc2866378945dea797c2502f7fdef748de385a544715219cd7"}
Apr 16 19:42:52.333474 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:52.333474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/must-gather-6j88h" event={"ID":"54dc9b29-e6ee-47e9-b405-3b727288c4ed","Type":"ContainerStarted","Data":"3c90590c2976fa20fe77f937f570025f0cd7e17b850b053da3e71adb474d7431"}
Apr 16 19:42:52.350039 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:52.349994 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5h99/must-gather-6j88h" podStartSLOduration=1.3900545260000001 podStartE2EDuration="2.349981034s" podCreationTimestamp="2026-04-16 19:42:50 +0000 UTC" firstStartedPulling="2026-04-16 19:42:50.897198965 +0000 UTC m=+749.432213391" lastFinishedPulling="2026-04-16 19:42:51.85712547 +0000 UTC m=+750.392139899" observedRunningTime="2026-04-16 19:42:52.348493578 +0000 UTC m=+750.883508025" watchObservedRunningTime="2026-04-16 19:42:52.349981034 +0000 UTC m=+750.884995481"
Apr 16 19:42:53.406043 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:53.406015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7bgmg_75450654-f95b-4a01-aacb-c02e80738893/global-pull-secret-syncer/0.log"
Apr 16 19:42:53.586851 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:53.586817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vwbhq_be225946-41f7-4fe4-8421-be88c9efe965/konnectivity-agent/0.log"
Apr 16 19:42:53.648657 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:53.648623 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-241.ec2.internal_8012d096d8d919883fff307930899c9a/haproxy/0.log"
Apr 16 19:42:57.651593 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:57.651548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54749bfbbc-zgxb4_88f4eff8-1228-4bfa-ae67-22af8203f8f2/authorino/0.log"
Apr 16 19:42:57.734027 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:57.733984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gvqs5_50652469-c81f-48dd-9c1e-a61da64cb5ab/kuadrant-console-plugin/0.log"
Apr 16 19:42:59.201872 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.201839 2575 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/alertmanager/0.log" Apr 16 19:42:59.228668 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.228633 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/config-reloader/0.log" Apr 16 19:42:59.254122 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.254021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/kube-rbac-proxy-web/0.log" Apr 16 19:42:59.274521 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.274495 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/kube-rbac-proxy/0.log" Apr 16 19:42:59.300534 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.300508 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/kube-rbac-proxy-metric/0.log" Apr 16 19:42:59.322204 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.322179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/prom-label-proxy/0.log" Apr 16 19:42:59.346283 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.346244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f0a646be-775c-46cf-a28a-25936d67d2a3/init-config-reloader/0.log" Apr 16 19:42:59.550007 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.549932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7f49bcb978-2xkc2_b207730a-c106-4c00-a949-9facc93e95b6/metrics-server/0.log" Apr 16 19:42:59.616534 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.616505 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-7jf9f_05235b17-5e57-4ce3-b9d2-70856ad228d0/node-exporter/0.log" Apr 16 19:42:59.641763 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.641735 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jf9f_05235b17-5e57-4ce3-b9d2-70856ad228d0/kube-rbac-proxy/0.log" Apr 16 19:42:59.661389 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:42:59.661362 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jf9f_05235b17-5e57-4ce3-b9d2-70856ad228d0/init-textfile/0.log" Apr 16 19:43:00.184025 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:00.183998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-585d4fc4cf-m6m44_059796ae-ded3-4d75-a36b-9d22829219b5/telemeter-client/0.log" Apr 16 19:43:00.204476 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:00.204411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-585d4fc4cf-m6m44_059796ae-ded3-4d75-a36b-9d22829219b5/reload/0.log" Apr 16 19:43:00.230921 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:00.230891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-585d4fc4cf-m6m44_059796ae-ded3-4d75-a36b-9d22829219b5/kube-rbac-proxy/0.log" Apr 16 19:43:01.924881 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:01.924848 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt"] Apr 16 19:43:01.931981 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:01.931955 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:01.936457 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:01.936429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt"] Apr 16 19:43:02.021369 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.021331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-podres\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.021633 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.021612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-lib-modules\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.021820 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.021801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-proc\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.021988 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.021967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zhd\" (UniqueName: \"kubernetes.io/projected/e4902a60-61e9-4f11-b09f-fd8993e50008-kube-api-access-b4zhd\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: 
\"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.022171 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.022153 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-sys\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122692 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-lib-modules\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-proc\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zhd\" (UniqueName: \"kubernetes.io/projected/e4902a60-61e9-4f11-b09f-fd8993e50008-kube-api-access-b4zhd\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-sys\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-proc\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-podres\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.122876 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-lib-modules\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.123221 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-sys\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.123221 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.122946 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e4902a60-61e9-4f11-b09f-fd8993e50008-podres\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.131816 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.131785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zhd\" (UniqueName: \"kubernetes.io/projected/e4902a60-61e9-4f11-b09f-fd8993e50008-kube-api-access-b4zhd\") pod \"perf-node-gather-daemonset-wnmwt\" (UID: \"e4902a60-61e9-4f11-b09f-fd8993e50008\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.244215 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.244134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:02.422296 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:02.421307 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt"] Apr 16 19:43:02.429770 ip-10-0-133-241 kubenswrapper[2575]: W0416 19:43:02.429080 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode4902a60_61e9_4f11_b09f_fd8993e50008.slice/crio-7708f1527d466bceab4d1c20cad4348a4070f72349a75a04763188403641072b WatchSource:0}: Error finding container 7708f1527d466bceab4d1c20cad4348a4070f72349a75a04763188403641072b: Status 404 returned error can't find the container with id 7708f1527d466bceab4d1c20cad4348a4070f72349a75a04763188403641072b Apr 16 19:43:03.382758 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.382724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" 
event={"ID":"e4902a60-61e9-4f11-b09f-fd8993e50008","Type":"ContainerStarted","Data":"52c3ab4a632834d626a49f0bdccfe8d69041d2c329325cfc31071b6ed84183f1"} Apr 16 19:43:03.383211 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.382765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" event={"ID":"e4902a60-61e9-4f11-b09f-fd8993e50008","Type":"ContainerStarted","Data":"7708f1527d466bceab4d1c20cad4348a4070f72349a75a04763188403641072b"} Apr 16 19:43:03.383211 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.382864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:03.402903 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.402851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" podStartSLOduration=2.402837315 podStartE2EDuration="2.402837315s" podCreationTimestamp="2026-04-16 19:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:43:03.401231312 +0000 UTC m=+761.936245760" watchObservedRunningTime="2026-04-16 19:43:03.402837315 +0000 UTC m=+761.937851794" Apr 16 19:43:03.808155 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.808124 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tv9rh_39bb06a3-3f67-42dd-9b2d-1fef39ab08c7/dns/0.log" Apr 16 19:43:03.832307 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.832264 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tv9rh_39bb06a3-3f67-42dd-9b2d-1fef39ab08c7/kube-rbac-proxy/0.log" Apr 16 19:43:03.894938 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:03.894912 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-9dg6k_ca3720a8-e72a-4e65-a1f7-4270435ae4e1/dns-node-resolver/0.log" Apr 16 19:43:04.419030 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:04.419001 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5f289_e76a993d-4a08-49b0-a7f9-dc97575009ad/node-ca/0.log" Apr 16 19:43:05.329017 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:05.328986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6b5579666b-tvzz2_2e2baf79-53c1-430e-921b-4af11527cc75/kube-auth-proxy/0.log" Apr 16 19:43:05.912710 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:05.912684 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q8rl9_5ba63d7e-2cb5-4b7c-8bbd-b135de519a76/serve-healthcheck-canary/0.log" Apr 16 19:43:06.363508 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:06.363428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5r8bs_4536532d-6693-4345-ac0d-083b15a27e72/insights-operator/0.log" Apr 16 19:43:06.413089 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:06.413039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f9rdz_2a2ce730-a721-4a99-8ce5-0e1c3344a897/kube-rbac-proxy/0.log" Apr 16 19:43:06.434783 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:06.434760 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f9rdz_2a2ce730-a721-4a99-8ce5-0e1c3344a897/exporter/0.log" Apr 16 19:43:06.455611 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:06.455586 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f9rdz_2a2ce730-a721-4a99-8ce5-0e1c3344a897/extractor/0.log" Apr 16 19:43:08.349483 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:08.349449 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86d84b9475-s65qk_1c67254d-f7ef-4df7-a36b-b57a51edc7f9/maas-api/0.log" Apr 16 19:43:08.471907 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:08.471858 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-rbs2j_a2b9be98-fc85-4722-bac5-71d18e928abc/manager/0.log" Apr 16 19:43:09.397627 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:09.397601 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-wnmwt" Apr 16 19:43:09.579433 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:09.579408 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bfdb756-4zkvs_d16d646f-ae3a-44fb-a958-2091814d236f/manager/0.log" Apr 16 19:43:13.907359 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:13.907323 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xhdqf_6c4fa5d0-8ce9-444f-9306-774a0f7d068c/migrator/0.log" Apr 16 19:43:13.925903 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:13.925879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xhdqf_6c4fa5d0-8ce9-444f-9306-774a0f7d068c/graceful-termination/0.log" Apr 16 19:43:15.196478 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.196452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7vjxb_3bbc206b-84a4-45f8-9836-82284b580174/kube-multus/0.log" Apr 16 19:43:15.381126 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.381101 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/kube-multus-additional-cni-plugins/0.log" Apr 16 19:43:15.400385 ip-10-0-133-241 kubenswrapper[2575]: 
I0416 19:43:15.400359 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/egress-router-binary-copy/0.log" Apr 16 19:43:15.425307 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.425274 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/cni-plugins/0.log" Apr 16 19:43:15.451174 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.451126 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/bond-cni-plugin/0.log" Apr 16 19:43:15.473764 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.473738 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/routeoverride-cni/0.log" Apr 16 19:43:15.493457 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.493434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/whereabouts-cni-bincopy/0.log" Apr 16 19:43:15.513953 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.513934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sslrf_0c96c659-d972-4967-bf3d-e50d4088b9e5/whereabouts-cni/0.log" Apr 16 19:43:15.792370 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.792292 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lxtld_046d5342-ca0b-4fe3-b388-6fa9f477de08/network-metrics-daemon/0.log" Apr 16 19:43:15.812330 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:15.812309 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-lxtld_046d5342-ca0b-4fe3-b388-6fa9f477de08/kube-rbac-proxy/0.log" Apr 16 19:43:16.614228 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.614203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/ovn-controller/0.log" Apr 16 19:43:16.635337 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.635305 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/ovn-acl-logging/0.log" Apr 16 19:43:16.652098 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.652075 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/kube-rbac-proxy-node/0.log" Apr 16 19:43:16.673946 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.673928 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:43:16.692329 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.692306 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/northd/0.log" Apr 16 19:43:16.711802 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.711782 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/nbdb/0.log" Apr 16 19:43:16.731524 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.731506 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/sbdb/0.log" Apr 16 19:43:16.839242 ip-10-0-133-241 kubenswrapper[2575]: I0416 19:43:16.839209 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gxnqw_16c561dd-93b3-4b83-9374-3a46663b8962/ovnkube-controller/0.log"