Apr 22 19:06:08.835826 ip-10-0-129-110 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:06:09.262425 ip-10-0-129-110 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:09.262425 ip-10-0-129-110 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:06:09.262425 ip-10-0-129-110 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:09.262987 ip-10-0-129-110 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:06:09.262987 ip-10-0-129-110 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:09.263431 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.263352 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:06:09.266441 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266427 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:09.266441 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266441 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266447 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
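
The deprecation warnings above all point at the same fix: move the flags into the file named by --config, which the FLAG dump later in this log shows is /etc/kubernetes/kubelet.conf. A minimal sketch of the config-file equivalents, assuming the upstream kubelet.config.k8s.io/v1beta1 schema and reusing the values this node actually passes on the command line (the eviction threshold is a hypothetical placeholder, since the warning only says to prefer eviction settings over --minimum-container-ttl-duration):

```yaml
# Sketch of a KubeletConfiguration fragment covering the deprecated flags;
# field names are from the upstream v1beta1 schema, values taken from this log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock      # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
systemReserved:            # was --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
evictionHard:              # hypothetical threshold; the warning says to use eviction
  memory.available: 100Mi  # settings instead of --minimum-container-ttl-duration
# --pod-infra-container-image has no config-file field; per the warning it is
# removed in 1.35 and the sandbox image comes from the CRI runtime instead.
```
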
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266452 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266455 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266458 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266461 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266464 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266467 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266469 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266472 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266475 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266477 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266480 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266483 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266486 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266488 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266491 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266493 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266496 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266503 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:09.266509 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266506 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266510 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266525 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266528 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266531 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266534 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266537 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266539 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266542 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266544 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266547 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266550 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266552 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266555 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266557 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266560 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266563 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266567 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266569 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266572 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:09.267036 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266574 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266577 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266580 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266582 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266585 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266587 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266590 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266592 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266595 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266598 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266600 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266603 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266605 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266608 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266611 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266614 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266616 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266619 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266622 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266624 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:09.267527 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266627 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266630 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266633 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266635 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266638 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266640 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266643 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266646 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266648 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266651 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266654 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266657 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266659 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266662 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266667 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266669 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266672 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266674 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266677 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:09.268011 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266680 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266683 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266686 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266690 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266693 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.266696 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267090 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267097 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267100 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267103 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267106 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267110 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267114 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267117 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267120 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267123 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267125 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267128 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267130 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:09.268464 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267133 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267136 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267139 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267141 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267144 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267146 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267149 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267151 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267155 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267157 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267160 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267162 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267165 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267167 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267170 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267172 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267175 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267177 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267180 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:09.268931 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267182 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267186 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267189 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267197 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267201 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267205 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267208 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267210 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267213 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267215 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267218 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267220 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267223 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267226 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267228 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267231 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267233 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267236 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267238 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267241 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:09.269400 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267243 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267246 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267249 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267251 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267254 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267256 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267259 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267261 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267264 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267266 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267269 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267271 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267274 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267277 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267280 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267283 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267286 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267289 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267291 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267294 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:09.269898 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267296 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267299 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267301 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267304 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267306 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267309 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267312 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267314 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267317 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267319 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267322 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267324 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267327 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.267329 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267403 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267422 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267432 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267439 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267447 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267453 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267459 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:06:09.270428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267463 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267467 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267470 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267473 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267477 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267480 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267483 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267486 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267489 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267492 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267494 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267497 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267501 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267504 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267507 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267509 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267524 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267528 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267531 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267534 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267538 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267541 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267544 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267547 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267550 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:06:09.270956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267553 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267557 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267561 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267564 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267567 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267570 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267573 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267578 2572 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267581 2572 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267584 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267587 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267590 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267594 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267597 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267600 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267603 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267606 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267609 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267612 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267614 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267617 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267620 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267623 2572 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267626 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267630 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:06:09.271566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267633 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267636 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267639 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267642 2572 flags.go:64] FLAG: --help="false"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267645 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267648 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267655 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267658 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267661 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267665 2572 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267668 2572 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267671 2572 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267674 2572 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267677 2572 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267680 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267683 2572 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267686 2572 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267689 2572 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267692 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267695 2572 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267698 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267701 2572 flags.go:64] FLAG: --lock-file=""
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267703 2572 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267706 2572 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:06:09.272214 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267709 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267714 2572 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267717 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267720 2572 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267722 2572 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267725 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267729 2572 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267731 2572 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267734 2572 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267739 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267742 2572 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267746 2572 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267749 2572 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267752 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267756 2572 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267759 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267762 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267765 2572 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267768 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267776 2572 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267779 2572 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267782 2572 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267786 2572 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:06:09.273422 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267789 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267793 2572 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267796 2572 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267800 2572 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267802 2572 flags.go:64] FLAG: --port="10250"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267805 2572 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267808 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-017ab79c0b9624b25"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267811 2572 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267814 2572 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267817 2572 flags.go:64] FLAG: --register-node="true"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267820 2572 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267823 2572 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267826 2572 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267829 2572 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267832 2572 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267837 2572 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267841 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267844 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267847 2572 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267850 2572 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267853 2572 flags.go:64] FLAG: --runonce="false"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267855 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267859 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267862 2572 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267867 2572 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267870 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:06:09.274050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267873 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267876 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267879 2572 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267889 2572 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267892 2572 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267895 2572 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267898 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267901 2572 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267904 2572 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267907 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267912 2572 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267915 2572 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267918 2572 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267921 2572 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267924 2572 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267927 2572 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267930 2572 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267933 2572 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267936 2572 flags.go:64] FLAG: --v="2"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267940 2572 flags.go:64] FLAG: --version="false"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267943 2572 flags.go:64] FLAG: --vmodule=""
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267949 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.267952 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268039 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268043 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:09.274696 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268046 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268048 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268051 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268054 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268057 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268061 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268064 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268066 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268069 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268072 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268074 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268082 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268085 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268088 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268090 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268093 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268096 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268098 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268101 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268103 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:09.275283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268106 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268108 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268111 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268113 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268116 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268119 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268121 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268124 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268129 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268131 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268134 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268136 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268139 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268142 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268144 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268147 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268149 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268153 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268155 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268158 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:09.275794 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268160 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268163 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268166 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268168 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268171 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268174 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268176 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268179 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268182 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268184 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268188 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268192 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268194 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268197 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268200 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268202 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268205 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268207 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268210 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:09.276287 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268212 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268216 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268218 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268221 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268223 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268226 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268229 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268231 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268234 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268236 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268240 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268243 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268246 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268248 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268251 
2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268254 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268256 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268259 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268262 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:09.276769 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268264 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268268 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268272 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268275 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268277 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.268280 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.268887 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.276026 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.276040 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276095 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276100 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276104 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276107 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276110 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276113 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 
19:06:09.277246 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276118 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276122 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276126 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276129 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276132 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276135 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276137 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276141 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276143 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276147 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276151 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276155 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276158 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276161 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276164 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276167 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276171 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276174 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276176 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276179 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:09.277673 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276181 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276184 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276186 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276189 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276191 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276195 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276198 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276200 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276203 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276205 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276208 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276211 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276213 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 
19:06:09.276216 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276218 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276221 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276224 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276226 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276228 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276231 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:09.278160 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276233 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276236 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276240 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276242 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276245 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276247 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276250 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276253 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276255 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276258 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276261 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276264 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276266 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276269 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276271 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276274 2572 feature_gate.go:328] 
unrecognized feature gate: PinnedImages Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276276 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276279 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276287 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276290 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:09.278734 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276293 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276296 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276298 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276301 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276304 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276306 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276309 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276311 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276314 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276316 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276319 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276321 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276324 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276327 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276330 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276333 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276336 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276338 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:09.279214 ip-10-0-129-110 
kubenswrapper[2572]: W0422 19:06:09.276341 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:09.279214 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276343 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.276349 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276451 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276456 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276460 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276464 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276467 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276470 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276473 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276475 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276478 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276481 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276484 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276486 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276489 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:09.279680 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276492 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276494 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276497 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276499 2572 feature_gate.go:328] 
unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276502 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276504 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276507 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276522 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276527 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276530 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276533 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276549 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276553 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276556 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276559 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276562 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276565 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276568 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276570 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:06:09.280048 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276573 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276577 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276580 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276582 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276585 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276587 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276590 2572 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276592 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276595 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276599 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276603 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276606 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276608 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276611 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276614 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276616 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276619 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276621 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276624 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276626 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:09.280506 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276629 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276631 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276634 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276637 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276639 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276642 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276644 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276647 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276649 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:09.280996 
ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276651 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276654 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276656 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276659 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276661 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276664 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276667 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276669 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276672 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276674 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276676 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:09.280996 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276679 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276682 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276685 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276687 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276690 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276692 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276695 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276698 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276700 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276703 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276705 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276708 2572 feature_gate.go:328] unrecognized 
feature gate: KMSEncryptionProvider Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276710 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:09.276713 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.276717 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:09.281482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.277307 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:06:09.281861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.279230 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:06:09.281861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.280165 2572 server.go:1019] "Starting client certificate rotation" Apr 22 19:06:09.281861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.280262 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:06:09.281861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.280320 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:06:09.304674 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.304658 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:06:09.308137 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.308116 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:06:09.318957 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.318940 2572 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:06:09.324270 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.324247 2572 log.go:25] "Validated CRI v1 image API" Apr 22 19:06:09.326297 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.326284 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:06:09.329848 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.329824 2572 fs.go:135] Filesystem UUIDs: map[6f76bb3b-5778-42f7-8132-e89b499472ba:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 87f0a093-7db1-42d8-bd91-016250ebb897:/dev/nvme0n1p4] Apr 22 19:06:09.329909 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.329850 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:06:09.333161 ip-10-0-129-110 
kubenswrapper[2572]: I0422 19:06:09.333144 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:06:09.335682 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.335577 2572 manager.go:217] Machine: {Timestamp:2026-04-22 19:06:09.333468272 +0000 UTC m=+0.381224469 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100473 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec229116456f77b056278401ff2a2ed0 SystemUUID:ec229116-456f-77b0-5627-8401ff2a2ed0 BootID:ae294a9a-9ad4-45bb-929b-b085d40fded6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:97:76:e5:b4:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:97:76:e5:b4:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:e3:f2:2c:c3:6f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:06:09.335682 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.335678 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:06:09.335791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.335747 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:06:09.336904 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.336880 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:06:09.337029 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.336906 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-110.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:06:09.337073 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.337038 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:06:09.337073 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.337045 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:06:09.337073 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.337058 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:09.337764 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.337753 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:09.338555 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.338545 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:06:09.338673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.338665 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:06:09.342252 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.342242 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:06:09.342289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.342261 2572 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 19:06:09.342289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.342274 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:06:09.342289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.342282 2572 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:06:09.342392 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.342291 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:06:09.343228 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.343217 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:09.343271 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.343234 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:09.346063 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.346040 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:06:09.347837 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.347824 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:06:09.348993 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.348983 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.348999 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349006 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349011 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349017 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349022 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:06:09.349028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349027 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:06:09.349180 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349033 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:06:09.349180 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349040 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:06:09.349180 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349045 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:06:09.349180 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349058 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:06:09.349180 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.349067 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:06:09.350403 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.350393 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:06:09.350403 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.350403 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 19:06:09.353877 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.353863 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:06:09.353944 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.353896 2572 server.go:1295] "Started kubelet" Apr 22 19:06:09.354017 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.353989 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:06:09.354204 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.354095 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:06:09.354331 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.354319 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:06:09.354702 ip-10-0-129-110 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:06:09.357275 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.357254 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:06:09.359783 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.359766 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:06:09.360047 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.360026 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-110.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:06:09.360715 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.360689 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:06:09.360823 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.360700 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:06:09.364119 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.363327 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-110.ec2.internal.18a8c3488c23bae5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-110.ec2.internal,UID:ip-10-0-129-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-110.ec2.internal,},FirstTimestamp:2026-04-22 19:06:09.353874149 +0000 UTC m=+0.401630346,LastTimestamp:2026-04-22 19:06:09.353874149 +0000 UTC m=+0.401630346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-110.ec2.internal,}" Apr 22 19:06:09.365099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.365057 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:06:09.365560 
ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.365531 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:06:09.366264 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.366242 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:06:09.366264 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.366243 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:06:09.366392 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.366270 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:06:09.366392 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.366382 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:06:09.366392 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.366390 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:06:09.367005 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.366818 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found" Apr 22 19:06:09.367152 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367133 2572 factory.go:153] Registering CRI-O factory Apr 22 19:06:09.367239 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367186 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 19:06:09.367310 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367279 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:06:09.367410 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367400 2572 factory.go:55] Registering systemd factory Apr 22 19:06:09.367474 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367420 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:06:09.367474 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367440 2572 factory.go:103] Registering Raw factory Apr 22 19:06:09.367474 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367453 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 19:06:09.367780 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.367740 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 19:06:09.367903 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.367887 2572 manager.go:319] Starting recovery of all containers
Apr 22 19:06:09.370680 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.370652 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 19:06:09.370808 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.370784 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 19:06:09.378392 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.378376 2572 manager.go:324] Recovery completed
Apr 22 19:06:09.382184 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.382171 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.384428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384413 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.384486 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384441 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.384486 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384451 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.384952 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384937 2572 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 19:06:09.384952 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384949 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 19:06:09.385068 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.384963 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 19:06:09.386631 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.386561 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-110.ec2.internal.18a8c3488df5f6f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-110.ec2.internal,UID:ip-10-0-129-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-110.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-110.ec2.internal,},FirstTimestamp:2026-04-22 19:06:09.3844293 +0000 UTC m=+0.432185498,LastTimestamp:2026-04-22 19:06:09.3844293 +0000 UTC m=+0.432185498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-110.ec2.internal,}"
Apr 22 19:06:09.387114 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.387101 2572 policy_none.go:49] "None policy: Start"
Apr 22 19:06:09.387165 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.387120 2572 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 19:06:09.387165 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.387130 2572 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 19:06:09.396033 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.395969 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-110.ec2.internal.18a8c3488df636b5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-110.ec2.internal,UID:ip-10-0-129-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-110.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-110.ec2.internal,},FirstTimestamp:2026-04-22 19:06:09.384445621 +0000 UTC m=+0.432201818,LastTimestamp:2026-04-22 19:06:09.384445621 +0000 UTC m=+0.432201818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-110.ec2.internal,}"
Apr 22 19:06:09.399747 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.399730 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4hrqz"
Apr 22 19:06:09.405119 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.405100 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4hrqz"
Apr 22 19:06:09.405773 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.405703 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-110.ec2.internal.18a8c3488df65bec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-110.ec2.internal,UID:ip-10-0-129-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-110.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-110.ec2.internal,},FirstTimestamp:2026-04-22 19:06:09.384455148 +0000 UTC m=+0.432211345,LastTimestamp:2026-04-22 19:06:09.384455148 +0000 UTC m=+0.432211345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-110.ec2.internal,}"
Apr 22 19:06:09.427074 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427062 2572 manager.go:341] "Starting Device Plugin manager"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.427088 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427096 2572 server.go:85] "Starting device plugin registration server"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427272 2572 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427281 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427360 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427435 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.427443 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.428863 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 19:06:09.432482 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.428897 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.453482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.453459 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 19:06:09.454643 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.454625 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 19:06:09.454720 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.454648 2572 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 19:06:09.454720 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.454664 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 19:06:09.454720 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.454671 2572 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 19:06:09.454720 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.454699 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 19:06:09.458260 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.458243 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:09.528139 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.528093 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.529386 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.529361 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.529459 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.529392 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.529459 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.529402 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.529459 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.529424 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.542010 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.541993 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.542058 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.542014 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-110.ec2.internal\": node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.555438 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.555417 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"]
Apr 22 19:06:09.555536 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.555490 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.556220 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.556207 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.556285 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.556235 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.556285 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.556248 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.557341 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557329 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.557470 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557456 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.557506 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557484 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.557986 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557966 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.558048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557990 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.558048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.557972 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.558048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.558005 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.558048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.558018 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.558048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.558030 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.559168 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.559154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.559210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.559177 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:06:09.559816 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.559799 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:06:09.559883 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.559830 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:06:09.559883 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.559843 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:06:09.578032 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.578014 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-110.ec2.internal\" not found" node="ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.582322 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.582307 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-110.ec2.internal\" not found" node="ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.587912 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.587896 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.668236 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.668216 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/54e4518df75bf5ebe24281d874521911-config\") pod \"kube-apiserver-proxy-ip-10-0-129-110.ec2.internal\" (UID: \"54e4518df75bf5ebe24281d874521911\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.668317 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.668239 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.668317 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.668259 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.688552 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.688525 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.769251 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/54e4518df75bf5ebe24281d874521911-config\") pod \"kube-apiserver-proxy-ip-10-0-129-110.ec2.internal\" (UID: \"54e4518df75bf5ebe24281d874521911\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.769370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.769370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.769370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.769370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/54e4518df75bf5ebe24281d874521911-config\") pod \"kube-apiserver-proxy-ip-10-0-129-110.ec2.internal\" (UID: \"54e4518df75bf5ebe24281d874521911\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.769370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.769369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2759787ebcf5e23cbf021ba7f2970669-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal\" (UID: \"2759787ebcf5e23cbf021ba7f2970669\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.789415 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.789368 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.880185 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.880159 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.884727 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:09.884711 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:09.890277 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.890261 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:09.990735 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:09.990709 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:10.091190 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.091143 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:10.184423 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.184396 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:10.191375 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.191353 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:10.279865 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.279842 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:06:10.280456 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.279943 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:06:10.280456 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.279991 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:06:10.291645 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.291629 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-110.ec2.internal\" not found"
Apr 22 19:06:10.296833 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.296815 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:10.343389 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.343343 2572 apiserver.go:52] "Watching apiserver"
Apr 22 19:06:10.352144 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.352125 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:06:10.352481 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.352459 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jhwl6","openshift-multus/multus-additional-cni-plugins-ghkl8","openshift-multus/multus-p7t9t","openshift-multus/network-metrics-daemon-hvrqj","openshift-network-diagnostics/network-check-target-djgz6","openshift-network-operator/iptables-alerter-c8tcr","kube-system/konnectivity-agent-jr9sz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc","openshift-cluster-node-tuning-operator/tuned-bl265","openshift-image-registry/node-ca-tbt7s","openshift-ovn-kubernetes/ovnkube-node-v4fm9"]
Apr 22 19:06:10.353878 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.353832 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jhwl6"
Apr 22 19:06:10.356347 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.356330 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-546cr\""
Apr 22 19:06:10.356550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.356364 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.356550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.356419 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.357010 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.356879 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.357010 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.356915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.358354 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.358139 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:10.358354 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.358224 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:10.359285 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:10.359403 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.359328 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359817 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359877 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qwmh9\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359939 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359979 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.359981 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:06:10.360099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.360043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:06:10.360604 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.360587 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.360698 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.360683 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jp9nb\""
Apr 22 19:06:10.361858 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.361842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:10.362877 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.362860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.362980 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.362943 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.362980 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.362963 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ddjjq\""
Apr 22 19:06:10.363085 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.363056 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.363317 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.363299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:06:10.363886 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.363868 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:06:10.363987 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.363872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:06:10.364050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.363995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.364116 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.364101 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nd968\""
Apr 22 19:06:10.364990 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.364974 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:06:10.365167 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.365155 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:06:10.365282 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.365269 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tbt7s"
Apr 22 19:06:10.365493 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.365434 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.365604 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.365547 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.365944 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.365928 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5zjtj\""
Apr 22 19:06:10.366026 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.366008 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.366081 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.366029 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"
Apr 22 19:06:10.366483 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.366466 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9rstv\""
Apr 22 19:06:10.366483 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.366480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.366639 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.366577 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.367618 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.367559 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:06:10.367740 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.367668 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.369965 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.369943 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.370479 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370594 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zh7lz\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370632 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370665 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370680 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:06:10.370818 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.370587 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:06:10.371068 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371013 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8j5vf\""
Apr 22 19:06:10.371866 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:10.371968 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysconfig\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.371968 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-conf\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.371968 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-cnibin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.371968 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371953 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-cni-binary-copy\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.371998 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-netns\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-sys-fs\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.372181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cnibin\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.372181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-modprobe-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372183 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-systemd\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-run\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-var-lib-kubelet\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372259 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-system-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-var-lib-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-lib-modules\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtdzl\" (UniqueName: \"kubernetes.io/projected/73fc4728-1f64-4c14-858a-1e77088a0308-kube-api-access-rtdzl\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-bin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372424 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/915929b1-126d-4f5a-8427-721f83701d23-host-slash\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-device-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-bin\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9twg\" (UniqueName: \"kubernetes.io/projected/40723a08-baf4-4bac-8032-9853f6f1a2e2-kube-api-access-r9twg\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372568 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzqcc\" (UniqueName: \"kubernetes.io/projected/6756b0b3-8e30-47c8-925b-478ee2126fcc-kube-api-access-kzqcc\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-systemd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372623 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372646 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-ovn\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-env-overrides\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-multus-daemon-config\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-multus-certs\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnt8\" (UniqueName: \"kubernetes.io/projected/915929b1-126d-4f5a-8427-721f83701d23-kube-api-access-rjnt8\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovn-node-metrics-cert\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372843 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-host\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.372871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-etc-kubernetes\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-os-release\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-socket-dir-parent\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.372980 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-conf-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-registration-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373039 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-etc-selinux\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/915929b1-126d-4f5a-8427-721f83701d23-iptables-alerter-script\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da0eed59-92c9-4593-b2f3-3ec47cc8c911-konnectivity-ca\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-etc-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-system-cni-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7228j\" (UniqueName: \"kubernetes.io/projected/64b28fc4-2a3b-4f1c-8433-a06545cb072a-kube-api-access-7228j\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-node-log\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373300 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-config\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.373594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c470388d-c98e-482f-9c89-a240f3abac2d-hosts-file\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrgsm\" (UniqueName: \"kubernetes.io/projected/c470388d-c98e-482f-9c89-a240f3abac2d-kube-api-access-hrgsm\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373376 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-kubernetes\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-hostroot\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-slash\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-tmp\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373479 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-kubelet\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-sys\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373543 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khccv\" (UniqueName: \"kubernetes.io/projected/db7cc212-874b-4767-a85f-3393efedb1fa-kube-api-access-khccv\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373566 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da0eed59-92c9-4593-b2f3-3ec47cc8c911-agent-certs\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-socket-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-systemd-units\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373645 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-netns\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-netd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-os-release\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6756b0b3-8e30-47c8-925b-478ee2126fcc-serviceca\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s"
Apr 22 19:06:10.374444 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krkpc\" (UniqueName: \"kubernetes.io/projected/a1a4c2e8-7247-484d-9439-dc4d46888d9b-kube-api-access-krkpc\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-k8s-cni-cncf-io\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jjk\" (UniqueName: \"kubernetes.io/projected/627cf532-d693-4215-85b9-807d744857ce-kube-api-access-j7jjk\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.373977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-etc-tuned\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-kubelet\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374041 2572
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-script-lib\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c470388d-c98e-482f-9c89-a240f3abac2d-tmp-dir\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-multus\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6756b0b3-8e30-47c8-925b-478ee2126fcc-host\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.375246 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.374223 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-log-socket\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.380643 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.380620 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:06:10.380786 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.380771 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal"] Apr 22 19:06:10.381650 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.381635 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in 
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:06:10.381712 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.381702 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal" Apr 22 19:06:10.388352 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.388335 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:06:10.388445 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.388375 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal"] Apr 22 19:06:10.404626 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.404598 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhw67" Apr 22 19:06:10.407043 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.407007 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:01:09 +0000 UTC" deadline="2027-11-02 15:26:47.962990128 +0000 UTC" Apr 22 19:06:10.407122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.407043 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13412h20m37.555949845s" Apr 22 19:06:10.413279 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.413262 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dhw67" Apr 22 19:06:10.464372 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.464318 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2759787ebcf5e23cbf021ba7f2970669.slice/crio-2b3b4a7037377ae042aacec7bf2deb0afbfe99f23e16e863390a4c2b0df7b355 WatchSource:0}: Error finding container 2b3b4a7037377ae042aacec7bf2deb0afbfe99f23e16e863390a4c2b0df7b355: Status 404 returned error can't find the container with id 2b3b4a7037377ae042aacec7bf2deb0afbfe99f23e16e863390a4c2b0df7b355 Apr 22 19:06:10.464866 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.464841 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e4518df75bf5ebe24281d874521911.slice/crio-3bbdfd97fe0860c5ec1c523b0969ba6800523b51f471551f1b5a690dfd845bdb WatchSource:0}: Error finding container 3bbdfd97fe0860c5ec1c523b0969ba6800523b51f471551f1b5a690dfd845bdb: Status 404 returned error can't find the container with id 3bbdfd97fe0860c5ec1c523b0969ba6800523b51f471551f1b5a690dfd845bdb Apr 22 19:06:10.466720 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.466702 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:06:10.470038 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.470025 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:06:10.474558 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.474627 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-os-release\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474627 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-socket-dir-parent\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474627 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-conf-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474627 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-registration-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-etc-selinux\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474660 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474674 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-conf-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-socket-dir-parent\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-registration-dir\") 
pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-os-release\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474707 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-etc-selinux\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.474798 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/915929b1-126d-4f5a-8427-721f83701d23-iptables-alerter-script\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da0eed59-92c9-4593-b2f3-3ec47cc8c911-konnectivity-ca\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-etc-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-system-cni-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474885 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7228j\" (UniqueName: \"kubernetes.io/projected/64b28fc4-2a3b-4f1c-8433-a06545cb072a-kube-api-access-7228j\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-node-log\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-config\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c470388d-c98e-482f-9c89-a240f3abac2d-hosts-file\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-etc-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475052 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrgsm\" (UniqueName: \"kubernetes.io/projected/c470388d-c98e-482f-9c89-a240f3abac2d-kube-api-access-hrgsm\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475082 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-kubernetes\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475107 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-hostroot\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-slash\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 
kubenswrapper[2572]: I0422 19:06:10.475195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-node-log\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475223 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-tmp\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-kubelet\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475265 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-sys\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475291 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khccv\" (UniqueName: \"kubernetes.io/projected/db7cc212-874b-4767-a85f-3393efedb1fa-kube-api-access-khccv\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/915929b1-126d-4f5a-8427-721f83701d23-iptables-alerter-script\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475301 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-binary-copy\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.474936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-system-cni-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: 
\"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475331 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-kubelet\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da0eed59-92c9-4593-b2f3-3ec47cc8c911-agent-certs\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-sys\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475395 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-kubernetes\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-socket-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-systemd-units\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da0eed59-92c9-4593-b2f3-3ec47cc8c911-konnectivity-ca\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-netns\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475554 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-netns\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.475884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-config\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-slash\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-hostroot\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-netd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-systemd-units\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475709 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-netd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-os-release\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c470388d-c98e-482f-9c89-a240f3abac2d-hosts-file\") pod 
\"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-socket-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6756b0b3-8e30-47c8-925b-478ee2126fcc-serviceca\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krkpc\" (UniqueName: \"kubernetes.io/projected/a1a4c2e8-7247-484d-9439-dc4d46888d9b-kube-api-access-krkpc\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475818 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-os-release\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-k8s-cni-cncf-io\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475845 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-multus-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-k8s-cni-cncf-io\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jjk\" (UniqueName: \"kubernetes.io/projected/627cf532-d693-4215-85b9-807d744857ce-kube-api-access-j7jjk\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:10.476761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-etc-tuned\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.475994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-kubelet\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-script-lib\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c470388d-c98e-482f-9c89-a240f3abac2d-tmp-dir\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.476121 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476174 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/6756b0b3-8e30-47c8-925b-478ee2126fcc-serviceca\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.476187 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:10.976157631 +0000 UTC m=+2.023913828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-multus\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6756b0b3-8e30-47c8-925b-478ee2126fcc-host\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-log-socket\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysconfig\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-conf\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-cnibin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.477600 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-cni-binary-copy\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c470388d-c98e-482f-9c89-a240f3abac2d-tmp-dir\") pod \"node-resolver-jhwl6\" (UID: \"c470388d-c98e-482f-9c89-a240f3abac2d\") " pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-netns\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-netns\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-sys-fs\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476560 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-multus\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6756b0b3-8e30-47c8-925b-478ee2126fcc-host\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cnibin\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-modprobe-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-systemd\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-run\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-var-lib-kubelet\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476760 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-system-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.478370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysctl-conf\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-log-socket\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-var-lib-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-var-lib-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-lib-modules\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-system-cni-dir\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtdzl\" (UniqueName: \"kubernetes.io/projected/73fc4728-1f64-4c14-858a-1e77088a0308-kube-api-access-rtdzl\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-run\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476912 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-kubelet\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476685 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-sysconfig\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476910 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-bin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-systemd\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-var-lib-cni-bin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/915929b1-126d-4f5a-8427-721f83701d23-host-slash\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/915929b1-126d-4f5a-8427-721f83701d23-host-slash\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-var-lib-kubelet\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-device-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477053 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-bin\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9twg\" (UniqueName: \"kubernetes.io/projected/40723a08-baf4-4bac-8032-9853f6f1a2e2-kube-api-access-r9twg\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477980 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzqcc\" (UniqueName: \"kubernetes.io/projected/6756b0b3-8e30-47c8-925b-478ee2126fcc-kube-api-access-kzqcc\") pod \"node-ca-tbt7s\" (UID: \"6756b0b3-8e30-47c8-925b-478ee2126fcc\") " pod="openshift-image-registry/node-ca-tbt7s"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-systemd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-ovn\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478078 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1a4c2e8-7247-484d-9439-dc4d46888d9b-cnibin\") pod \"multus-additional-cni-plugins-ghkl8\" (UID: \"a1a4c2e8-7247-484d-9439-dc4d46888d9b\") " pod="openshift-multus/multus-additional-cni-plugins-ghkl8"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-env-overrides\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-etc-modprobe-d\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-lib-modules\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.476823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-cnibin\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478534 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-env-overrides\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-sys-fs\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-cni-bin\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/64b28fc4-2a3b-4f1c-8433-a06545cb072a-device-dir\") pod \"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.478926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovnkube-script-lib\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.479976 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.477840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da0eed59-92c9-4593-b2f3-3ec47cc8c911-agent-certs\") pod \"konnectivity-agent-jr9sz\" (UID: \"da0eed59-92c9-4593-b2f3-3ec47cc8c911\") " pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-cni-binary-copy\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-multus-daemon-config\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-openvswitch\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479133 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-tmp\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-multus-certs\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479179 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/73fc4728-1f64-4c14-858a-1e77088a0308-etc-tuned\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479226 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-ovn\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40723a08-baf4-4bac-8032-9853f6f1a2e2-run-systemd\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-host-run-multus-certs\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnt8\" (UniqueName: \"kubernetes.io/projected/915929b1-126d-4f5a-8427-721f83701d23-kube-api-access-rjnt8\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovn-node-metrics-cert\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-host\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-etc-kubernetes\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db7cc212-874b-4767-a85f-3393efedb1fa-etc-kubernetes\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/73fc4728-1f64-4c14-858a-1e77088a0308-host\") pod \"tuned-bl265\" (UID: \"73fc4728-1f64-4c14-858a-1e77088a0308\") " pod="openshift-cluster-node-tuning-operator/tuned-bl265"
Apr 22 19:06:10.480660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.479671 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db7cc212-874b-4767-a85f-3393efedb1fa-multus-daemon-config\") pod \"multus-p7t9t\" (UID: \"db7cc212-874b-4767-a85f-3393efedb1fa\") " pod="openshift-multus/multus-p7t9t"
Apr 22 19:06:10.482712 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.482693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40723a08-baf4-4bac-8032-9853f6f1a2e2-ovn-node-metrics-cert\") pod \"ovnkube-node-v4fm9\" (UID: \"40723a08-baf4-4bac-8032-9853f6f1a2e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:10.483360 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.483342 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:10.483475 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.483364 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:10.483475 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.483377 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:10.483475 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.483441 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:10.983426282 +0000 UTC m=+2.031182476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
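The kube-api-access-sn8rr failure above is the first of a series: that volume is a projected volume, assembled from the pod's bound ServiceAccount token plus the kube-root-ca.crt ConfigMap and, on OpenShift, the openshift-service-ca.crt ConfigMap named in the errors, and SetUp cannot complete until the kubelet has those namespace objects registered in its secret/configmap managers. A rough sketch of the shape of such a volume, using the upstream k8s.io/api/core/v1 types (requires k8s.io/api in go.mod); the helper name and the 3607-second token lifetime are illustrative assumptions, not values taken from this log:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// kubeAPIAccessVolume sketches what a generated "kube-api-access-*" projected
// volume roughly contains: a bound ServiceAccount token plus CA bundles. The
// three sources below match the objects named in the SetUp errors above.
func kubeAPIAccessVolume(name string) corev1.Volume {
	expiry := int64(3607) // assumed default bound-token lifetime
	return corev1.Volume{
		Name: name, // e.g. "kube-api-access-sn8rr"
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
					}},
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "openshift-service-ca.crt"},
					}},
				},
			},
		},
	}
}

func main() {
	v := kubeAPIAccessVolume("kube-api-access-sn8rr")
	fmt.Println(len(v.VolumeSource.Projected.Sources)) // 3
}

If any one source object is unavailable, the whole projected volume fails SetUp, which is why both ConfigMap errors are reported together before the operation is requeued.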
\"aws-ebs-csi-driver-node-mwpdc\" (UID: \"64b28fc4-2a3b-4f1c-8433-a06545cb072a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.487238 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.487206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnt8\" (UniqueName: \"kubernetes.io/projected/915929b1-126d-4f5a-8427-721f83701d23-kube-api-access-rjnt8\") pod \"iptables-alerter-c8tcr\" (UID: \"915929b1-126d-4f5a-8427-721f83701d23\") " pod="openshift-network-operator/iptables-alerter-c8tcr" Apr 22 19:06:10.673402 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.673338 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:10.678381 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.678366 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jhwl6" Apr 22 19:06:10.684259 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.684237 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc470388d_c98e_482f_9c89_a240f3abac2d.slice/crio-4ddf0e2d5e6def45cbaa3b307567cd4996e60c7748b0584618396a87ff60477d WatchSource:0}: Error finding container 4ddf0e2d5e6def45cbaa3b307567cd4996e60c7748b0584618396a87ff60477d: Status 404 returned error can't find the container with id 4ddf0e2d5e6def45cbaa3b307567cd4996e60c7748b0584618396a87ff60477d Apr 22 19:06:10.692781 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.692763 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" Apr 22 19:06:10.698854 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.698826 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a4c2e8_7247_484d_9439_dc4d46888d9b.slice/crio-b0105f342ec58548615bb185ff566eaef2000dd68b71066c3fcf6d39863c0025 WatchSource:0}: Error finding container b0105f342ec58548615bb185ff566eaef2000dd68b71066c3fcf6d39863c0025: Status 404 returned error can't find the container with id b0105f342ec58548615bb185ff566eaef2000dd68b71066c3fcf6d39863c0025 Apr 22 19:06:10.702753 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.702737 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p7t9t" Apr 22 19:06:10.706196 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.706180 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-c8tcr" Apr 22 19:06:10.708233 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.708212 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb7cc212_874b_4767_a85f_3393efedb1fa.slice/crio-0a3c12b78d432db78133c175d09bb8de7b8fbec2d1f60df68468afa584e2a054 WatchSource:0}: Error finding container 0a3c12b78d432db78133c175d09bb8de7b8fbec2d1f60df68468afa584e2a054: Status 404 returned error can't find the container with id 0a3c12b78d432db78133c175d09bb8de7b8fbec2d1f60df68468afa584e2a054 Apr 22 19:06:10.712646 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.712628 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915929b1_126d_4f5a_8427_721f83701d23.slice/crio-2512a34c4d13ce077b58d3ccd3a30eb861d69db68284e54d65093ad645ab5130 WatchSource:0}: Error finding container 2512a34c4d13ce077b58d3ccd3a30eb861d69db68284e54d65093ad645ab5130: Status 404 returned error can't find the container with id 2512a34c4d13ce077b58d3ccd3a30eb861d69db68284e54d65093ad645ab5130 Apr 22 19:06:10.738115 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.738095 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:10.743631 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.743611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" Apr 22 19:06:10.744069 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.744046 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0eed59_92c9_4593_b2f3_3ec47cc8c911.slice/crio-3ae3fd06fe16e2c835d17005964e67e6274bedb56fb415b1063cc3b1c44af7ba WatchSource:0}: Error finding container 3ae3fd06fe16e2c835d17005964e67e6274bedb56fb415b1063cc3b1c44af7ba: Status 404 returned error can't find the container with id 3ae3fd06fe16e2c835d17005964e67e6274bedb56fb415b1063cc3b1c44af7ba Apr 22 19:06:10.748202 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.748047 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bl265" Apr 22 19:06:10.749613 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.749593 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64b28fc4_2a3b_4f1c_8433_a06545cb072a.slice/crio-3b647d46744eedcaf7f2c709008a5bc6338ed14b3190985935c6f546c39ccb86 WatchSource:0}: Error finding container 3b647d46744eedcaf7f2c709008a5bc6338ed14b3190985935c6f546c39ccb86: Status 404 returned error can't find the container with id 3b647d46744eedcaf7f2c709008a5bc6338ed14b3190985935c6f546c39ccb86 Apr 22 19:06:10.753449 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.753427 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73fc4728_1f64_4c14_858a_1e77088a0308.slice/crio-aa1c1281a3919563ffff7c46624bdb6f951f1e1655f04f943b953a4713c6448b WatchSource:0}: Error finding container aa1c1281a3919563ffff7c46624bdb6f951f1e1655f04f943b953a4713c6448b: Status 404 returned error can't find the container with id aa1c1281a3919563ffff7c46624bdb6f951f1e1655f04f943b953a4713c6448b Apr 22 19:06:10.769423 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.769401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tbt7s" Apr 22 19:06:10.773970 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.773954 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:10.774283 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:10.774266 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6756b0b3_8e30_47c8_925b_478ee2126fcc.slice/crio-966a48683d955f5fef32dd851330f85c4e60aabb1032b2c86a9d7726c1f63d4c WatchSource:0}: Error finding container 966a48683d955f5fef32dd851330f85c4e60aabb1032b2c86a9d7726c1f63d4c: Status 404 returned error can't find the container with id 966a48683d955f5fef32dd851330f85c4e60aabb1032b2c86a9d7726c1f63d4c Apr 22 19:06:10.983544 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.983424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:10.983544 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:10.983469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.983581 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.983600 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: 
E0422 19:06:10.983612 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.983620 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.983640 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:11.983625547 +0000 UTC m=+3.031381736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:10.983704 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:10.983669 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:11.983660524 +0000 UTC m=+3.031416708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:11.414234 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.414158 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:10 +0000 UTC" deadline="2027-10-04 11:50:17.307159412 +0000 UTC" Apr 22 19:06:11.414234 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.414196 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12712h44m5.89296775s" Apr 22 19:06:11.447304 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.447278 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:11.468328 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.467733 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:11.468328 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.467891 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:11.477606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.477530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bl265" event={"ID":"73fc4728-1f64-4c14-858a-1e77088a0308","Type":"ContainerStarted","Data":"aa1c1281a3919563ffff7c46624bdb6f951f1e1655f04f943b953a4713c6448b"} Apr 22 19:06:11.480064 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.480014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" event={"ID":"64b28fc4-2a3b-4f1c-8433-a06545cb072a","Type":"ContainerStarted","Data":"3b647d46744eedcaf7f2c709008a5bc6338ed14b3190985935c6f546c39ccb86"} Apr 22 19:06:11.483367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.483324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jr9sz" event={"ID":"da0eed59-92c9-4593-b2f3-3ec47cc8c911","Type":"ContainerStarted","Data":"3ae3fd06fe16e2c835d17005964e67e6274bedb56fb415b1063cc3b1c44af7ba"} Apr 22 19:06:11.484745 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.484692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c8tcr" event={"ID":"915929b1-126d-4f5a-8427-721f83701d23","Type":"ContainerStarted","Data":"2512a34c4d13ce077b58d3ccd3a30eb861d69db68284e54d65093ad645ab5130"} Apr 22 19:06:11.488004 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.487980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerStarted","Data":"b0105f342ec58548615bb185ff566eaef2000dd68b71066c3fcf6d39863c0025"} Apr 22 19:06:11.497700 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.496111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal" event={"ID":"54e4518df75bf5ebe24281d874521911","Type":"ContainerStarted","Data":"3bbdfd97fe0860c5ec1c523b0969ba6800523b51f471551f1b5a690dfd845bdb"} Apr 22 19:06:11.502930 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.502906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"bf030140b6479fe29b1bc5cf7b417e706d9176d819cc328d5fcfd9d15f569c7b"} Apr 22 19:06:11.507365 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.507342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tbt7s" event={"ID":"6756b0b3-8e30-47c8-925b-478ee2126fcc","Type":"ContainerStarted","Data":"966a48683d955f5fef32dd851330f85c4e60aabb1032b2c86a9d7726c1f63d4c"} Apr 22 19:06:11.515660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.513237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p7t9t" event={"ID":"db7cc212-874b-4767-a85f-3393efedb1fa","Type":"ContainerStarted","Data":"0a3c12b78d432db78133c175d09bb8de7b8fbec2d1f60df68468afa584e2a054"} Apr 22 19:06:11.522963 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.522827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jhwl6" event={"ID":"c470388d-c98e-482f-9c89-a240f3abac2d","Type":"ContainerStarted","Data":"4ddf0e2d5e6def45cbaa3b307567cd4996e60c7748b0584618396a87ff60477d"} Apr 22 19:06:11.525753 ip-10-0-129-110 
Apr 22 19:06:11.477606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.477530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bl265" event={"ID":"73fc4728-1f64-4c14-858a-1e77088a0308","Type":"ContainerStarted","Data":"aa1c1281a3919563ffff7c46624bdb6f951f1e1655f04f943b953a4713c6448b"}
Apr 22 19:06:11.480064 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.480014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" event={"ID":"64b28fc4-2a3b-4f1c-8433-a06545cb072a","Type":"ContainerStarted","Data":"3b647d46744eedcaf7f2c709008a5bc6338ed14b3190985935c6f546c39ccb86"}
Apr 22 19:06:11.483367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.483324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jr9sz" event={"ID":"da0eed59-92c9-4593-b2f3-3ec47cc8c911","Type":"ContainerStarted","Data":"3ae3fd06fe16e2c835d17005964e67e6274bedb56fb415b1063cc3b1c44af7ba"}
Apr 22 19:06:11.484745 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.484692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c8tcr" event={"ID":"915929b1-126d-4f5a-8427-721f83701d23","Type":"ContainerStarted","Data":"2512a34c4d13ce077b58d3ccd3a30eb861d69db68284e54d65093ad645ab5130"}
Apr 22 19:06:11.488004 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.487980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerStarted","Data":"b0105f342ec58548615bb185ff566eaef2000dd68b71066c3fcf6d39863c0025"}
Apr 22 19:06:11.497700 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.496111 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal" event={"ID":"54e4518df75bf5ebe24281d874521911","Type":"ContainerStarted","Data":"3bbdfd97fe0860c5ec1c523b0969ba6800523b51f471551f1b5a690dfd845bdb"}
Apr 22 19:06:11.502930 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.502906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"bf030140b6479fe29b1bc5cf7b417e706d9176d819cc328d5fcfd9d15f569c7b"}
Apr 22 19:06:11.507365 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.507342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tbt7s" event={"ID":"6756b0b3-8e30-47c8-925b-478ee2126fcc","Type":"ContainerStarted","Data":"966a48683d955f5fef32dd851330f85c4e60aabb1032b2c86a9d7726c1f63d4c"}
Apr 22 19:06:11.515660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.513237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p7t9t" event={"ID":"db7cc212-874b-4767-a85f-3393efedb1fa","Type":"ContainerStarted","Data":"0a3c12b78d432db78133c175d09bb8de7b8fbec2d1f60df68468afa584e2a054"}
Apr 22 19:06:11.522963 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.522827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jhwl6" event={"ID":"c470388d-c98e-482f-9c89-a240f3abac2d","Type":"ContainerStarted","Data":"4ddf0e2d5e6def45cbaa3b307567cd4996e60c7748b0584618396a87ff60477d"}
Apr 22 19:06:11.525753 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.525726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal" event={"ID":"2759787ebcf5e23cbf021ba7f2970669","Type":"ContainerStarted","Data":"2b3b4a7037377ae042aacec7bf2deb0afbfe99f23e16e863390a4c2b0df7b355"}
Apr 22 19:06:11.991504 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.991468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:11.991693 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:11.991545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:11.991753 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.991709 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:11.991753 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.991731 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:11.991753 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.991743 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:11.991909 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.991800 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:13.991781263 +0000 UTC m=+5.039537450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:11.992199 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.992182 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:11.992263 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:11.992237 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:13.992221282 +0000 UTC m=+5.039977471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:12.289871 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:12.289643 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:12.415076 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:12.414991 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:10 +0000 UTC" deadline="2028-01-19 20:37:15.299811528 +0000 UTC"
Apr 22 19:06:12.415076 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:12.415022 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15289h31m2.884792642s"
Apr 22 19:06:12.455682 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:12.455179 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:12.455682 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:12.455306 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:13.457482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:13.457413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:13.457967 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:13.457549 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:14.008539 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:14.008596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.008767 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.008787 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.008800 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.008856 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:18.008837127 +0000 UTC m=+9.056593312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.009232 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:14.009311 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.009279 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:18.009264805 +0000 UTC m=+9.057021005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:14.456228 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:14.455496 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:14.456228 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:14.455659 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:15.455099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:15.455066 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:15.455574 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:15.455213 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:16.057363 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.057321 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bz72b"]
Apr 22 19:06:16.061000 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.060500 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.061000 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.060589 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:16.128588 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.128556 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.128744 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.128605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-kubelet-config\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.128744 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.128631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-dbus\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.229332 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.229296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-kubelet-config\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.229473 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.229344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-dbus\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.229473 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.229445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.229609 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.229578 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:16.229666 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.229651 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:16.729620092 +0000 UTC m=+7.777376281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:16.229722 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.229709 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-kubelet-config\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.229887 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.229858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e0ff6be1-282c-4ce7-bd85-3383ced78c15-dbus\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.455860 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.455667 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:16.455860 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.455802 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:16.734227 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:16.734143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:16.734386 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.734316 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:16.734386 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:16.734384 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:17.734370392 +0000 UTC m=+8.782126581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:17.455351 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:17.455321 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:17.455552 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:17.455333 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:17.455552 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:17.455439 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:17.455552 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:17.455536 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:17.740389 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:17.740298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:17.740812 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:17.740541 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:17.740812 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:17.740611 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:19.740591994 +0000 UTC m=+10.788348195 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:18.042162 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:18.042070 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:18.042162 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:18.042124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042235 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042259 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042272 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042285 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042308 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:26.042288716 +0000 UTC m=+17.090044903 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:18.042378 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.042329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:26.042316043 +0000 UTC m=+17.090072247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:18.455868 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:18.455783 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:18.456044 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:18.455940 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:19.456028 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:19.455996 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:19.456491 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:19.456093 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:19.456491 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:19.456156 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:19.456491 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:19.456238 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:19.755769 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:19.755679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:19.755915 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:19.755811 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:06:19.755915 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:19.755873 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:23.755855725 +0000 UTC m=+14.803611914 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered
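Across these nestedpendingoperations entries the durationBeforeRetry value doubles on every consecutive failure of the same volume: 500ms, 1s, 2s, 4s, 8s for kube-api-access-sn8rr and metrics-certs, and the same progression starting over for original-pull-secret. Each failing volume gets its own schedule, which is why original-pull-secret restarts at 500ms at 19:06:16 while the network-diagnostics volumes are already at 8s. A stdlib-only sketch of that schedule; the 500ms start matches the log, while the cap is an assumed stand-in for the kubelet's real limit:

package main

import (
	"fmt"
	"time"
)

// retryBackoff reproduces the doubling delay visible in the log:
// 500ms -> 1s -> 2s -> 4s -> 8s -> ... up to a cap.
type retryBackoff struct {
	initial time.Duration // delay after the first failure
	limit   time.Duration // assumed upper bound; not taken from this log
	current time.Duration
}

// next returns the delay to wait before the following retry,
// doubling on each call until the limit is reached.
func (b *retryBackoff) next() time.Duration {
	switch {
	case b.current == 0:
		b.current = b.initial
	case b.current*2 > b.limit:
		b.current = b.limit
	default:
		b.current *= 2
	}
	return b.current
}

func main() {
	b := &retryBackoff{initial: 500 * time.Millisecond, limit: 2 * time.Minute}
	for i := 0; i < 6; i++ {
		fmt.Printf("retry %d after %v\n", i+1, b.next()) // 500ms, 1s, 2s, 4s, 8s, 16s
	}
}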
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:20.455380 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:20.455359 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:20.455523 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:20.455492 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:21.455841 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.455664 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:21.456208 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.455659 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:21.456208 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:21.455886 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:21.456208 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:21.455944 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
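Note on the retry timing above: for the same volume, the logged durationBeforeRetry doubles on each consecutive failure (original-pull-secret goes 4s here, then 8s and 16s below; metrics-certs goes 8s, 16s, 32s), i.e. per-operation exponential backoff. A minimal Go sketch of that pattern follows; the type names, the initial delay, and the cap are illustrative assumptions, not the kubelet's actual nestedpendingoperations implementation.

package main

import (
	"fmt"
	"time"
)

// backoff reproduces the doubling pattern visible in the durationBeforeRetry
// values of the mount errors above. Initial delay and cap are assumptions.
type backoff struct {
	initial, max, current time.Duration
}

// next returns the delay before the next retry: the first call yields the
// initial delay, each later call doubles it, clamped at the maximum.
func (b *backoff) next() time.Duration {
	if b.current == 0 {
		b.current = b.initial
	} else {
		b.current *= 2
		if b.current > b.max {
			b.current = b.max
		}
	}
	return b.current
}

func main() {
	b := backoff{initial: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 0; i < 8; i++ {
		fmt.Println(b.next()) // 500ms 1s 2s 4s 8s 16s 32s 1m4s
	}
}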
pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:21.547037 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.546884 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerStarted","Data":"d72c74482b034ffb12cb7902ba6ebfe697b645239f8dada610eeeb4518147b58"} Apr 22 19:06:21.548256 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.548226 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal" event={"ID":"54e4518df75bf5ebe24281d874521911","Type":"ContainerStarted","Data":"b4f39721b93d2506f2f3b77d824c26e0966f4c78ba7f1046a6940bf60fcb36ac"} Apr 22 19:06:21.549296 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.549275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tbt7s" event={"ID":"6756b0b3-8e30-47c8-925b-478ee2126fcc","Type":"ContainerStarted","Data":"dd4538089ce08469a8c1703c8fcfea43f39df0008ffc0c13fafd0bafdb430296"} Apr 22 19:06:21.550298 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.550279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jhwl6" event={"ID":"c470388d-c98e-482f-9c89-a240f3abac2d","Type":"ContainerStarted","Data":"ab71a9f1b6e691bdaf400cbc226534298ebf90372fd3b29d87ed4850990980e5"} Apr 22 19:06:21.551400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.551376 2572 generic.go:358] "Generic (PLEG): container finished" podID="2759787ebcf5e23cbf021ba7f2970669" containerID="f4cc5fcdbf02d766542771d1fda0d63546863a8cb2914ecddc432f6669033538" exitCode=0 Apr 22 19:06:21.551484 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.551443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal" event={"ID":"2759787ebcf5e23cbf021ba7f2970669","Type":"ContainerDied","Data":"f4cc5fcdbf02d766542771d1fda0d63546863a8cb2914ecddc432f6669033538"} Apr 22 19:06:21.552790 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.552764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bl265" event={"ID":"73fc4728-1f64-4c14-858a-1e77088a0308","Type":"ContainerStarted","Data":"f44b7cba03a9b19a824c7bb39cfef11c1635df0af8ac9093c6960dd4b8dabe4e"} Apr 22 19:06:21.554065 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.554040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" event={"ID":"64b28fc4-2a3b-4f1c-8433-a06545cb072a","Type":"ContainerStarted","Data":"30765ec68d4b1357cdb89769868d4ff39612b87e3857842647658e8d2edf8978"} Apr 22 19:06:21.555282 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.555264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jr9sz" event={"ID":"da0eed59-92c9-4593-b2f3-3ec47cc8c911","Type":"ContainerStarted","Data":"12c67ed906e1b3634016affa229d12015aee0a862f25e95eb26755ba6822ff9c"} Apr 22 19:06:21.601435 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.601397 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jr9sz" podStartSLOduration=2.903072014 podStartE2EDuration="12.601386622s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.746688284 +0000 UTC m=+1.794444468" 
lastFinishedPulling="2026-04-22 19:06:20.445002879 +0000 UTC m=+11.492759076" observedRunningTime="2026-04-22 19:06:21.600938341 +0000 UTC m=+12.648694548" watchObservedRunningTime="2026-04-22 19:06:21.601386622 +0000 UTC m=+12.649142825" Apr 22 19:06:21.636696 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.636526 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bl265" podStartSLOduration=2.934468536 podStartE2EDuration="12.636496408s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.754879284 +0000 UTC m=+1.802635469" lastFinishedPulling="2026-04-22 19:06:20.456907142 +0000 UTC m=+11.504663341" observedRunningTime="2026-04-22 19:06:21.622412422 +0000 UTC m=+12.670168627" watchObservedRunningTime="2026-04-22 19:06:21.636496408 +0000 UTC m=+12.684252616" Apr 22 19:06:21.653734 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.653563 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tbt7s" podStartSLOduration=2.9857491229999997 podStartE2EDuration="12.653552285s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.775548607 +0000 UTC m=+1.823304794" lastFinishedPulling="2026-04-22 19:06:20.443351761 +0000 UTC m=+11.491107956" observedRunningTime="2026-04-22 19:06:21.635914689 +0000 UTC m=+12.683670895" watchObservedRunningTime="2026-04-22 19:06:21.653552285 +0000 UTC m=+12.701308490" Apr 22 19:06:21.653734 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.653715 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-110.ec2.internal" podStartSLOduration=11.653709828 podStartE2EDuration="11.653709828s" podCreationTimestamp="2026-04-22 19:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:21.653110834 +0000 UTC m=+12.700867041" watchObservedRunningTime="2026-04-22 19:06:21.653709828 +0000 UTC m=+12.701466034" Apr 22 19:06:21.675728 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:21.675690 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jhwl6" podStartSLOduration=2.919984179 podStartE2EDuration="12.6756747s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.685663815 +0000 UTC m=+1.733419999" lastFinishedPulling="2026-04-22 19:06:20.441354319 +0000 UTC m=+11.489110520" observedRunningTime="2026-04-22 19:06:21.675469139 +0000 UTC m=+12.723225342" watchObservedRunningTime="2026-04-22 19:06:21.6756747 +0000 UTC m=+12.723430904" Apr 22 19:06:22.455302 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:22.455275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:22.455491 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:22.455385 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
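Note on the pod_startup_latency_tracker entries above: in every entry, podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling); for konnectivity-agent-jr9sz that is 12.601386622s - 9.698314595s ~= 2.903s, matching the logged 2.903072014 to within nanoseconds. A small Go check using the values copied from that entry; the few-nanosecond residual is likely the log mixing wall-clock and monotonic (m=) readings.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamp format printed in the log lines above.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Values from the konnectivity-agent-jr9sz entry.
	e2e := 12601386622 * time.Nanosecond // podStartE2EDuration="12.601386622s"
	pull := parse("2026-04-22 19:06:20.445002879 +0000 UTC"). // lastFinishedPulling
			Sub(parse("2026-04-22 19:06:10.746688284 +0000 UTC")) // firstStartedPulling
	fmt.Println(e2e - pull) // 2.903072027s vs logged podStartSLOduration=2.903072014
}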
pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:22.558789 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:22.558752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal" event={"ID":"2759787ebcf5e23cbf021ba7f2970669","Type":"ContainerStarted","Data":"6f99efeb19f101c00eed8182a48368b6fbe918d048ef6d495662ac183b5ed9c5"} Apr 22 19:06:22.560911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:22.560828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c8tcr" event={"ID":"915929b1-126d-4f5a-8427-721f83701d23","Type":"ContainerStarted","Data":"7452afbe54ee2c1161890a87aaab5aa2478b6c5200cfe5508baf1742fb709c79"} Apr 22 19:06:22.574204 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:22.574169 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-110.ec2.internal" podStartSLOduration=12.574157167 podStartE2EDuration="12.574157167s" podCreationTimestamp="2026-04-22 19:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:22.573670645 +0000 UTC m=+13.621426852" watchObservedRunningTime="2026-04-22 19:06:22.574157167 +0000 UTC m=+13.621913374" Apr 22 19:06:22.587078 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:22.586892 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c8tcr" podStartSLOduration=3.858524076 podStartE2EDuration="13.586875908s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.713835878 +0000 UTC m=+1.761592061" lastFinishedPulling="2026-04-22 19:06:20.442187695 +0000 UTC m=+11.489943893" observedRunningTime="2026-04-22 19:06:22.586352099 +0000 UTC m=+13.634108305" watchObservedRunningTime="2026-04-22 19:06:22.586875908 +0000 UTC m=+13.634632114" Apr 22 19:06:23.455091 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:23.455060 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:23.455264 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:23.455104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:23.455264 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:23.455179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:23.455384 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:23.455311 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:23.785225 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:23.785152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:23.785699 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:23.785264 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:23.785699 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:23.785319 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:31.785301384 +0000 UTC m=+22.833057569 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:24.455843 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:24.455811 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:24.455999 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:24.455940 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:25.455023 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.454939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:25.455023 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.454979 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:25.455531 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:25.455060 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:25.455531 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:25.455233 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:25.566712 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.566677 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="d72c74482b034ffb12cb7902ba6ebfe697b645239f8dada610eeeb4518147b58" exitCode=0 Apr 22 19:06:25.566877 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.566723 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"d72c74482b034ffb12cb7902ba6ebfe697b645239f8dada610eeeb4518147b58"} Apr 22 19:06:25.920402 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.920372 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:25.921671 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:25.921649 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jr9sz" Apr 22 19:06:26.101829 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:26.101799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:26.101977 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:26.101836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:26.102039 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.101989 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:26.102039 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.102026 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:26.102131 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.102046 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:26.102131 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.102053 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:06:42.10203747 +0000 UTC m=+33.149793655 (durationBeforeRetry 16s). 
Apr 22 19:06:26.102131 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.102058 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:26.102131 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.102116 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:06:42.102094351 +0000 UTC m=+33.149850538 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:26.454862 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:26.454832 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:26.455022 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:26.454951 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:27.454851 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:27.454817 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:27.455239 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:27.454830 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:27.455239 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:27.454925 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:27.455239 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:27.455016 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:28.455846 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:28.455809 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:28.456371 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:28.455909 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:29.263827 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.263794 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:29.263992 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.263902 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:06:29.264398 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.264374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jr9sz"
Apr 22 19:06:29.455570 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.455537 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:29.455723 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:29.455620 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:29.455723 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.455672 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:29.455826 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:29.455747 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:29.892440 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:29.892258 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:06:30.439529 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.439280 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:06:29.89243903Z","UUID":"89f9daff-a61e-47a4-92c3-48e2c81e6465","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:06:30.441131 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.441107 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:06:30.441233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.441138 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:06:30.455596 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.455572 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:30.455700 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:30.455665 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:30.576714 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.576683 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p7t9t" event={"ID":"db7cc212-874b-4767-a85f-3393efedb1fa","Type":"ContainerStarted","Data":"76ae58432d1ed5c005fd264e1df8e12be59a76a679120f50141719f5d8c4508d"} Apr 22 19:06:30.579522 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.579477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" event={"ID":"64b28fc4-2a3b-4f1c-8433-a06545cb072a","Type":"ContainerStarted","Data":"a2b8dc24f02b1bc610871644df8bf93bd900c238bc9861c98951926b7f5b995f"} Apr 22 19:06:30.583057 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583031 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"48830175b88292c47b422dd6198773a35d6f83c3ee450e560e36a9c4bc24ae4a"} Apr 22 19:06:30.583057 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583063 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"3c7a286e9bb60ba6ca9ded2a7a7689e7c907d292639ac83f9eeaebd0889d94e9"} Apr 22 19:06:30.583233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"7879916e177cb5033b12ab72b41b9fc5e4db8b84780a9751252e23872425cf76"} Apr 22 19:06:30.583233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"48c9f0193b961f337d613fe41dda5ef86783ee23ec19db34d301df686286720d"} Apr 22 19:06:30.583233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583104 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"4707c44ffe289c34d7fced5d01d8a6005446259c8029e15f267a659c2df0f597"} Apr 22 19:06:30.583233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.583117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"e8055404f79dc14d93447a84d483b5b6f8e3c33f9b5c6a3ba3c6eff4fde6d4d9"} Apr 22 19:06:30.594399 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:30.594355 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p7t9t" podStartSLOduration=2.604451251 podStartE2EDuration="21.594341107s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.710040365 +0000 UTC m=+1.757796549" lastFinishedPulling="2026-04-22 19:06:29.699930212 +0000 UTC m=+20.747686405" observedRunningTime="2026-04-22 19:06:30.593655313 +0000 UTC m=+21.641411519" watchObservedRunningTime="2026-04-22 19:06:30.594341107 +0000 UTC m=+21.642097313" Apr 22 19:06:31.455341 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:31.455263 2572 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:31.455909 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:31.455263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:31.455909 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:31.455383 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:31.455909 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:31.455421 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:31.586847 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:31.586784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" event={"ID":"64b28fc4-2a3b-4f1c-8433-a06545cb072a","Type":"ContainerStarted","Data":"5a3427cace18327c974001888ca0261ab6db1a3576e29179cb279701234cc765"} Apr 22 19:06:31.604288 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:31.604242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mwpdc" podStartSLOduration=2.352283618 podStartE2EDuration="22.604231701s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.751439205 +0000 UTC m=+1.799195398" lastFinishedPulling="2026-04-22 19:06:31.003387297 +0000 UTC m=+22.051143481" observedRunningTime="2026-04-22 19:06:31.604027957 +0000 UTC m=+22.651784163" watchObservedRunningTime="2026-04-22 19:06:31.604231701 +0000 UTC m=+22.651987904" Apr 22 19:06:31.844679 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:31.844648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:31.844827 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:31.844779 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:31.844876 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:31.844834 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret podName:e0ff6be1-282c-4ce7-bd85-3383ced78c15 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:47.844817533 +0000 UTC m=+38.892573720 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret") pod "global-pull-secret-syncer-bz72b" (UID: "e0ff6be1-282c-4ce7-bd85-3383ced78c15") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:06:32.455667 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:32.455602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:32.455992 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:32.455706 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:32.591590 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:32.591551 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"870ed2f54945a0dd2a25dbfdfa0dc52284f25fbf891210f533c4deec1b175826"} Apr 22 19:06:33.455370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:33.455291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:33.455544 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:33.455297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:33.455544 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:33.455425 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:33.455544 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:33.455522 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:34.455216 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:34.455184 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:34.455711 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:34.455318 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:35.455678 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.455411 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:35.455980 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.455451 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:35.455980 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:35.455783 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:35.455980 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:35.455820 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:35.598594 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.598536 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="341e84b04421a92ca3743c8fae07ddec1ea1fd1d4cb09cc61fbce1c211768e6b" exitCode=0 Apr 22 19:06:35.598712 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.598604 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"341e84b04421a92ca3743c8fae07ddec1ea1fd1d4cb09cc61fbce1c211768e6b"} Apr 22 19:06:35.601843 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.601817 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" event={"ID":"40723a08-baf4-4bac-8032-9853f6f1a2e2","Type":"ContainerStarted","Data":"5d5ec7a7e227ebc62cd41cc75f992bcc9420d89f79292dbb9672fc37d5d2d969"} Apr 22 19:06:35.602114 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.602090 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:35.602233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.602123 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:35.616225 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.616207 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:35.616320 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.616268 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:06:35.646982 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:35.646946 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" podStartSLOduration=7.681937736 podStartE2EDuration="26.646932499s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.781389434 +0000 UTC m=+1.829145618" lastFinishedPulling="2026-04-22 19:06:29.746384196 +0000 UTC m=+20.794140381" 
observedRunningTime="2026-04-22 19:06:35.646540286 +0000 UTC m=+26.694296491" watchObservedRunningTime="2026-04-22 19:06:35.646932499 +0000 UTC m=+26.694688705" Apr 22 19:06:36.455259 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:36.455234 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:36.455473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:36.455345 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:06:36.603245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:36.603224 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:06:37.005926 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.005897 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bz72b"] Apr 22 19:06:37.006100 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.005992 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:37.006100 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:37.006089 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15" Apr 22 19:06:37.008664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.008642 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-djgz6"] Apr 22 19:06:37.008747 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.008712 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:06:37.008799 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:37.008781 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad" Apr 22 19:06:37.011527 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.011489 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hvrqj"] Apr 22 19:06:37.011606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.011576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:06:37.011695 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:37.011675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
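Note on the probe sequencing above: konnectivity-agent's startup probe reports unhealthy and then started (19:06:25), and only afterwards do its readiness transitions appear (19:06:29); the same holds for ovnkube-node's readiness flaps. In Kubernetes, readiness and liveness probes are suppressed until a configured startup probe has succeeded. A tiny Go sketch of that gating follows; the types are illustrative, not the kubelet's prober structures.

package main

import "fmt"

type containerProbes struct {
	started bool // latest startup probe result
	ready   bool // latest readiness probe result
}

// effectiveReady mirrors the gating: a container is never reported ready
// before its startup probe has passed.
func (c containerProbes) effectiveReady() bool {
	return c.started && c.ready
}

func main() {
	c := containerProbes{}
	fmt.Println(c.effectiveReady()) // false: startup probe still unhealthy
	c.started = true                // probe="startup" status="started"
	c.ready = true                  // probe="readiness" status="ready"
	fmt.Println(c.effectiveReady()) // true
}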
Apr 22 19:06:37.606099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.606069 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="9b135cde17484e021f55042312029715b40186e3820cad35078d90cdc3ca47c7" exitCode=0
Apr 22 19:06:37.606528 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.606138 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"9b135cde17484e021f55042312029715b40186e3820cad35078d90cdc3ca47c7"}
Apr 22 19:06:37.606528 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:37.606273 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:06:38.455321 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:38.455262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:38.455420 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:38.455262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:38.455420 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:38.455379 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:38.455501 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:38.455418 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:38.455501 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:38.455279 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:38.455583 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:38.455525 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:38.610051 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:38.610027 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="144e806978eb578f669b7dae175e3fd75463c6eb686084bedf4a4481faeb2dac" exitCode=0
Apr 22 19:06:38.610356 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:38.610060 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"144e806978eb578f669b7dae175e3fd75463c6eb686084bedf4a4481faeb2dac"}
Apr 22 19:06:40.201163 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.200988 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9"
Apr 22 19:06:40.201743 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.201367 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 19:06:40.211379 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.211326 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" podUID="40723a08-baf4-4bac-8032-9853f6f1a2e2" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 19:06:40.219478 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.219420 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" podUID="40723a08-baf4-4bac-8032-9853f6f1a2e2" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 22 19:06:40.455534 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.455444 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:40.455534 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.455462 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:40.455534 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:40.455444 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:40.455749 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:40.455591 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce"
Apr 22 19:06:40.455749 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:40.455638 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bz72b" podUID="e0ff6be1-282c-4ce7-bd85-3383ced78c15"
Apr 22 19:06:40.455749 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:40.455708 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-djgz6" podUID="ac07db1d-d903-453c-9c47-68daab7361ad"
Apr 22 19:06:42.127146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.127121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.127167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127271 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127281 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127293 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127302 2572 projected.go:194] Error preparing data for projected volume kube-api-access-sn8rr for pod openshift-network-diagnostics/network-check-target-djgz6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127328 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.127312961 +0000 UTC m=+65.175069149 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:42.127756 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.127344 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr podName:ac07db1d-d903-453c-9c47-68daab7361ad nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.127336958 +0000 UTC m=+65.175093145 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-sn8rr" (UniqueName: "kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr") pod "network-check-target-djgz6" (UID: "ac07db1d-d903-453c-9c47-68daab7361ad") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:42.329974 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.329943 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-110.ec2.internal" event="NodeReady"
Apr 22 19:06:42.330124 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.330077 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:06:42.365653 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.365586 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b5687b7fb-dblvq"]
Apr 22 19:06:42.421360 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.421314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b5687b7fb-dblvq"]
Apr 22 19:06:42.421360 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.421356 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s5lsg"]
Apr 22 19:06:42.421603 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.421431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.424019 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.423993 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 19:06:42.424019 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.424001 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 19:06:42.426341 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.425901 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 19:06:42.426341 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.426000 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-px248\""
Apr 22 19:06:42.445466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.445442 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 19:06:42.457290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.457272 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4mp4v"]
Apr 22 19:06:42.457398 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.457378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj"
Apr 22 19:06:42.457451 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.457394 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6"
Apr 22 19:06:42.457496 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.457467 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5lsg"
Apr 22 19:06:42.457566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.457536 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b"
Apr 22 19:06:42.459951 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.459916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 19:06:42.460059 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.459983 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 19:06:42.460059 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460028 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\""
Apr 22 19:06:42.460059 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460033 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:06:42.460209 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460088 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 19:06:42.460602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460336 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:06:42.460602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460389 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\""
Apr 22 19:06:42.460602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460352 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:06:42.460602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.460425 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6t9wx\""
Apr 22 19:06:42.481545 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.481505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5lsg"]
Apr 22 19:06:42.481630 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.481551 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mp4v"]
Apr 22 19:06:42.481630 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.481579 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mp4v"
Apr 22 19:06:42.483951 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.483931 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:06:42.484169 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.483957 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:06:42.484169 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.483965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:06:42.484169 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.484000 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\""
Apr 22 19:06:42.530541 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.530645 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.530645 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.530645 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530604 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.530645 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq"
Apr 22 19:06:42.530840 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl4q2\" (UniqueName: \"kubernetes.io/projected/20d45b6d-4197-46bf-bb48-01fcb92e75d7-kube-api-access-kl4q2\") pod
\"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.530840 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20d45b6d-4197-46bf-bb48-01fcb92e75d7-config-volume\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.530840 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbblz\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.530840 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.530967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.530967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20d45b6d-4197-46bf-bb48-01fcb92e75d7-tmp-dir\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.530967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.530920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.631990 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.631927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.631990 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.631970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl4q2\" (UniqueName: \"kubernetes.io/projected/20d45b6d-4197-46bf-bb48-01fcb92e75d7-kube-api-access-kl4q2\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.632147 
ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.631996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:42.632147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20d45b6d-4197-46bf-bb48-01fcb92e75d7-config-volume\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.632147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbblz\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxs4\" (UniqueName: \"kubernetes.io/projected/f941c8a3-428c-47d0-a796-fd116d4256dc-kube-api-access-tvxs4\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:42.632147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20d45b6d-4197-46bf-bb48-01fcb92e75d7-tmp-dir\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.632300 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:42.632390 ip-10-0-129-110 
kubenswrapper[2572]: I0422 19:06:42.632324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632390 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.632365 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:43.132345671 +0000 UTC m=+34.180101873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.632598 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.632613 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.632668 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" 
failed. No retries permitted until 2026-04-22 19:06:43.132650589 +0000 UTC m=+34.180406785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:42.632723 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632711 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20d45b6d-4197-46bf-bb48-01fcb92e75d7-config-volume\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.632941 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.632758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20d45b6d-4197-46bf-bb48-01fcb92e75d7-tmp-dir\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.644866 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.644842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.645433 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.645407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.645893 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.645873 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl4q2\" (UniqueName: \"kubernetes.io/projected/20d45b6d-4197-46bf-bb48-01fcb92e75d7-kube-api-access-kl4q2\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:42.646716 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.646692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.647322 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.647300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbblz\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.647614 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.647582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token\") pod 
\"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.648106 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.648079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:42.733367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.733341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:42.733486 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.733379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxs4\" (UniqueName: \"kubernetes.io/projected/f941c8a3-428c-47d0-a796-fd116d4256dc-kube-api-access-tvxs4\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:42.733572 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.733492 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:42.733631 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:42.733586 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:43.233565179 +0000 UTC m=+34.281321369 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:06:42.742706 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:42.742683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxs4\" (UniqueName: \"kubernetes.io/projected/f941c8a3-428c-47d0-a796-fd116d4256dc-kube-api-access-tvxs4\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:43.135200 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:43.135171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:43.135229 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.135313 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.135317 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.135335 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.135385 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:44.135371783 +0000 UTC m=+35.183127971 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:43.135571 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.135397 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:44.135391702 +0000 UTC m=+35.183147886 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:43.236259 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:43.236234 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:43.236401 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.236356 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:43.236460 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:43.236416 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:44.236397939 +0000 UTC m=+35.284154137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:06:44.142766 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:44.142545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.142708 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:44.142838 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.142889 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:46.142870564 +0000 UTC m=+37.190626748 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.142938 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.142950 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:44.143198 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.142997 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:46.142984028 +0000 UTC m=+37.190740217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:44.243334 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:44.243276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:44.243555 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.243473 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:44.243629 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:44.243568 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:46.243544636 +0000 UTC m=+37.291300834 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:06:46.160105 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:46.160077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:46.160144 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.160227 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.160272 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:50.160258874 +0000 UTC m=+41.208015057 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.160228 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.160318 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:46.160367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.160363 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:50.160348268 +0000 UTC m=+41.208104458 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:46.261393 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:46.261371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:46.261545 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.261526 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:46.261614 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:46.261582 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:50.261568329 +0000 UTC m=+41.309324517 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:06:46.627196 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:46.627168 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="0d05042d849a4981e176ad4ae986b9f34897fdb5ab77b507b0d815cdd9fc3d3e" exitCode=0 Apr 22 19:06:46.627290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:46.627221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"0d05042d849a4981e176ad4ae986b9f34897fdb5ab77b507b0d815cdd9fc3d3e"} Apr 22 19:06:47.631476 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:47.631442 2572 generic.go:358] "Generic (PLEG): container finished" podID="a1a4c2e8-7247-484d-9439-dc4d46888d9b" containerID="25dfbc27949ba2cd6503c630aa434183e4e58e20b67e38355a09d3c9ce3cc167" exitCode=0 Apr 22 19:06:47.631875 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:47.631494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerDied","Data":"25dfbc27949ba2cd6503c630aa434183e4e58e20b67e38355a09d3c9ce3cc167"} Apr 22 19:06:47.873703 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:47.873682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:47.876894 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:47.876870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e0ff6be1-282c-4ce7-bd85-3383ced78c15-original-pull-secret\") pod \"global-pull-secret-syncer-bz72b\" (UID: \"e0ff6be1-282c-4ce7-bd85-3383ced78c15\") " 
pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:48.175994 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:48.175965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bz72b" Apr 22 19:06:48.347378 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:48.347172 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bz72b"] Apr 22 19:06:48.352311 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:06:48.352282 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ff6be1_282c_4ce7_bd85_3383ced78c15.slice/crio-aa08f95fead54734c977a7fc61771e90a2ab0047c74ff3a12f15dd09aaa69b79 WatchSource:0}: Error finding container aa08f95fead54734c977a7fc61771e90a2ab0047c74ff3a12f15dd09aaa69b79: Status 404 returned error can't find the container with id aa08f95fead54734c977a7fc61771e90a2ab0047c74ff3a12f15dd09aaa69b79 Apr 22 19:06:48.635687 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:48.635645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" event={"ID":"a1a4c2e8-7247-484d-9439-dc4d46888d9b","Type":"ContainerStarted","Data":"9e76e1706c3491c0bc0912ad3eeb09be90298e8d2c7673847d116c9102bf751c"} Apr 22 19:06:48.636499 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:48.636478 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bz72b" event={"ID":"e0ff6be1-282c-4ce7-bd85-3383ced78c15","Type":"ContainerStarted","Data":"aa08f95fead54734c977a7fc61771e90a2ab0047c74ff3a12f15dd09aaa69b79"} Apr 22 19:06:48.660856 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:48.660817 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ghkl8" podStartSLOduration=4.210313399 podStartE2EDuration="39.660806654s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:06:10.700329357 +0000 UTC m=+1.748085541" lastFinishedPulling="2026-04-22 19:06:46.150822612 +0000 UTC m=+37.198578796" observedRunningTime="2026-04-22 19:06:48.659403769 +0000 UTC m=+39.707159975" watchObservedRunningTime="2026-04-22 19:06:48.660806654 +0000 UTC m=+39.708562860" Apr 22 19:06:50.189792 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:50.189761 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:50.189814 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.189903 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.189907 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not 
found Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.189926 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.189950 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:58.189938884 +0000 UTC m=+49.237695073 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:50.190342 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.189962 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:58.18995688 +0000 UTC m=+49.237713064 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:50.290532 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:50.290494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:50.290686 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.290644 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:50.290749 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:50.290723 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:06:58.290703722 +0000 UTC m=+49.338459914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:06:54.650269 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:54.650052 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bz72b" event={"ID":"e0ff6be1-282c-4ce7-bd85-3383ced78c15","Type":"ContainerStarted","Data":"81ccaa7bd63d4db89dee58f6ac0215f0097d474642ed9b63e5c3eb66663bcc6f"} Apr 22 19:06:54.667013 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:54.666961 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bz72b" podStartSLOduration=33.268143784 podStartE2EDuration="38.666948075s" podCreationTimestamp="2026-04-22 19:06:16 +0000 UTC" firstStartedPulling="2026-04-22 19:06:48.353779214 +0000 UTC m=+39.401535398" lastFinishedPulling="2026-04-22 19:06:53.752583501 +0000 UTC m=+44.800339689" observedRunningTime="2026-04-22 19:06:54.666726995 +0000 UTC m=+45.714483202" watchObservedRunningTime="2026-04-22 19:06:54.666948075 +0000 UTC m=+45.714704282" Apr 22 19:06:58.250122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:58.250086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:58.250139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.250230 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.250236 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.250253 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.250279 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.250266104 +0000 UTC m=+65.298022287 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:06:58.250473 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.250296 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.2502842 +0000 UTC m=+65.298040384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:06:58.350903 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:06:58.350877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:06:58.351021 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.351010 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:06:58.351069 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:06:58.351059 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:07:14.351044643 +0000 UTC m=+65.398800848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:07:10.221339 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:10.221306 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v4fm9" Apr 22 19:07:14.154120 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.154071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:07:14.154120 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.154123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:07:14.156883 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.156861 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:07:14.156989 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.156970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:07:14.164886 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.164869 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:07:14.164928 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.164921 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:08:18.164905531 +0000 UTC m=+129.212661716 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : secret "metrics-daemon-secret" not found Apr 22 19:07:14.166612 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.166599 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:07:14.178437 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.178407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8rr\" (UniqueName: \"kubernetes.io/projected/ac07db1d-d903-453c-9c47-68daab7361ad-kube-api-access-sn8rr\") pod \"network-check-target-djgz6\" (UID: \"ac07db1d-d903-453c-9c47-68daab7361ad\") " pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:07:14.255303 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.255278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:07:14.255441 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.255349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:07:14.255483 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.255443 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:07:14.255483 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.255460 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:07:14.255563 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.255448 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:14.255563 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.255531 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:46.2554978 +0000 UTC m=+97.303253989 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:07:14.255563 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.255558 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:46.255547662 +0000 UTC m=+97.303303853 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:07:14.289959 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.289937 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6t9wx\"" Apr 22 19:07:14.297985 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.297968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:07:14.356530 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.355952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:07:14.356530 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.356173 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:14.356530 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:14.356235 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:07:46.356217679 +0000 UTC m=+97.403973866 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:07:14.428618 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.428553 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-djgz6"] Apr 22 19:07:14.432155 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:07:14.432127 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac07db1d_d903_453c_9c47_68daab7361ad.slice/crio-df2f8fbf54dd9b9d2e61ba7432ab096f0ede93f1feb7daff0ee79da574976fb2 WatchSource:0}: Error finding container df2f8fbf54dd9b9d2e61ba7432ab096f0ede93f1feb7daff0ee79da574976fb2: Status 404 returned error can't find the container with id df2f8fbf54dd9b9d2e61ba7432ab096f0ede93f1feb7daff0ee79da574976fb2 Apr 22 19:07:14.685245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:14.685173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-djgz6" event={"ID":"ac07db1d-d903-453c-9c47-68daab7361ad","Type":"ContainerStarted","Data":"df2f8fbf54dd9b9d2e61ba7432ab096f0ede93f1feb7daff0ee79da574976fb2"} Apr 22 19:07:17.692238 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:17.692212 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-djgz6" event={"ID":"ac07db1d-d903-453c-9c47-68daab7361ad","Type":"ContainerStarted","Data":"799636a72d2d4b04db012a3ee5e1d1bbb05b20335c80c685a97570ac447f6295"} Apr 22 19:07:17.692571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:17.692326 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:07:17.708176 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:17.708130 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-djgz6" podStartSLOduration=65.670402459 podStartE2EDuration="1m8.708117436s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:07:14.434352702 +0000 UTC m=+65.482108888" lastFinishedPulling="2026-04-22 19:07:17.472067675 +0000 UTC m=+68.519823865" observedRunningTime="2026-04-22 19:07:17.70755335 +0000 UTC m=+68.755309567" watchObservedRunningTime="2026-04-22 19:07:17.708117436 +0000 UTC m=+68.755873620" Apr 22 19:07:46.269290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:46.269171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:07:46.269290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:46.269273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:07:46.269784 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.269314 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:07:46.269784 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.269331 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:07:46.269784 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.269373 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:46.269784 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.269385 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:50.269369863 +0000 UTC m=+161.317126047 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:07:46.269784 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.269420 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:50.269409598 +0000 UTC m=+161.317165783 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:07:46.369668 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:46.369641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:07:46.369790 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.369740 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:46.369790 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:07:46.369776 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:08:50.369765981 +0000 UTC m=+161.417522165 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:07:48.696151 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:07:48.696113 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-djgz6" Apr 22 19:08:18.186748 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:18.186705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:08:18.187203 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:18.186844 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:08:18.187203 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:18.186898 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs podName:627cf532-d693-4215-85b9-807d744857ce nodeName:}" failed. No retries permitted until 2026-04-22 19:10:20.186884891 +0000 UTC m=+251.234641080 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs") pod "network-metrics-daemon-hvrqj" (UID: "627cf532-d693-4215-85b9-807d744857ce") : secret "metrics-daemon-secret" not found Apr 22 19:08:42.163944 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.163908 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8"] Apr 22 19:08:42.165564 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.165548 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" Apr 22 19:08:42.167858 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.167828 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh"] Apr 22 19:08:42.168498 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.168476 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-f2nbz\"" Apr 22 19:08:42.168934 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.168917 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.168934 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.168928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.169602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.169586 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877"] Apr 22 19:08:42.169737 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.169721 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.171128 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.171112 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-fc6bc5684-m58jn"] Apr 22 19:08:42.171235 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.171221 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.172940 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.172788 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.173012 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.172945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:08:42.173069 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.173018 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.173245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.173224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.173413 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.173394 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.173489 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.173403 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-25v95\"" Apr 22 19:08:42.174646 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.174626 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:08:42.174744 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.174666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:08:42.174744 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.174666 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.174744 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.174722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2xntp\"" Apr 22 19:08:42.175782 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.175764 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.176018 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.175772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:08:42.176674 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.176655 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8"] Apr 22 19:08:42.176754 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.176690 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wkk4m\"" Apr 22 19:08:42.179351 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.179333 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:08:42.179449 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.179359 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:08:42.179449 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.179336 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.179673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.179656 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:08:42.192364 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.192338 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877"] Apr 22 19:08:42.193451 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.193417 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh"] Apr 22 19:08:42.196047 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.196028 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-fc6bc5684-m58jn"] Apr 22 19:08:42.244186 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mgr\" (UniqueName: \"kubernetes.io/projected/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-kube-api-access-l8mgr\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.244186 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-stats-auth\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.244361 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.244361 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-default-certificate\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.244361 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.244361 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh5dx\" (UniqueName: \"kubernetes.io/projected/94ebe646-1062-42b2-ba8b-a73a0b60e0f6-kube-api-access-sh5dx\") pod \"volume-data-source-validator-7c6cbb6c87-ldws8\" (UID: \"94ebe646-1062-42b2-ba8b-a73a0b60e0f6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" Apr 22 19:08:42.244361 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.244550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mc6v\" (UniqueName: \"kubernetes.io/projected/763efc39-e846-49c8-8d1f-df055b7efff8-kube-api-access-7mc6v\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.244550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.244550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.244550 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.244459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k4h9\" (UniqueName: \"kubernetes.io/projected/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-kube-api-access-2k4h9\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.267727 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.267700 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4"] Apr 22 19:08:42.269490 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.269473 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" Apr 22 19:08:42.272031 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.272008 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nwqv5\"" Apr 22 19:08:42.295725 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.295703 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4"] Apr 22 19:08:42.344913 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.344883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2k4h9\" (UniqueName: \"kubernetes.io/projected/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-kube-api-access-2k4h9\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.345039 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.344950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mgr\" (UniqueName: \"kubernetes.io/projected/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-kube-api-access-l8mgr\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.345039 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.344976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-stats-auth\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.345039 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpzns\" (UniqueName: \"kubernetes.io/projected/db73c2bb-358b-42ee-b00b-b1c586f50162-kube-api-access-gpzns\") pod \"network-check-source-8894fc9bd-jkvx4\" (UID: \"db73c2bb-358b-42ee-b00b-b1c586f50162\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" Apr 22 19:08:42.345171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.345171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-default-certificate\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.345171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod 
\"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.345171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh5dx\" (UniqueName: \"kubernetes.io/projected/94ebe646-1062-42b2-ba8b-a73a0b60e0f6-kube-api-access-sh5dx\") pod \"volume-data-source-validator-7c6cbb6c87-ldws8\" (UID: \"94ebe646-1062-42b2-ba8b-a73a0b60e0f6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" Apr 22 19:08:42.345171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mc6v\" (UniqueName: \"kubernetes.io/projected/763efc39-e846-49c8-8d1f-df055b7efff8-kube-api-access-7mc6v\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:42.845212758 +0000 UTC m=+153.892968944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345399 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:08:42.345467 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345424 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:42.345828 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345475 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:42.345828 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345482 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls podName:1d66a3f8-13ef-4828-8abb-f0982f4e7a97 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:42.845464744 +0000 UTC m=+153.893220940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rjggh" (UID: "1d66a3f8-13ef-4828-8abb-f0982f4e7a97") : secret "samples-operator-tls" not found Apr 22 19:08:42.345828 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345501 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:42.845493634 +0000 UTC m=+153.893249836 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:42.345828 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.345561 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:08:42.845507155 +0000 UTC m=+153.893263354 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:42.346021 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.345937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.347595 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.347576 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-stats-auth\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.348102 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.348076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-default-certificate\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.363673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.363651 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr"] Apr 22 19:08:42.363879 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.363835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k4h9\" (UniqueName: \"kubernetes.io/projected/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-kube-api-access-2k4h9\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.364149 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.364122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mgr\" (UniqueName: \"kubernetes.io/projected/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-kube-api-access-l8mgr\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.364354 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.364332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh5dx\" (UniqueName: \"kubernetes.io/projected/94ebe646-1062-42b2-ba8b-a73a0b60e0f6-kube-api-access-sh5dx\") pod \"volume-data-source-validator-7c6cbb6c87-ldws8\" (UID: \"94ebe646-1062-42b2-ba8b-a73a0b60e0f6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" Apr 22 19:08:42.364428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.364378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mc6v\" (UniqueName: 
\"kubernetes.io/projected/763efc39-e846-49c8-8d1f-df055b7efff8-kube-api-access-7mc6v\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.366138 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.366122 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.368564 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.368544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rz64w\"" Apr 22 19:08:42.368680 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.368544 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:08:42.369122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.369107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:08:42.374933 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.374916 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr"] Apr 22 19:08:42.445847 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.445795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9dd00bb-c13a-4382-a816-dc9639b9e184-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.445929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.445861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.445929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.445895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpzns\" (UniqueName: \"kubernetes.io/projected/db73c2bb-358b-42ee-b00b-b1c586f50162-kube-api-access-gpzns\") pod \"network-check-source-8894fc9bd-jkvx4\" (UID: \"db73c2bb-358b-42ee-b00b-b1c586f50162\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" Apr 22 19:08:42.453900 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.453882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpzns\" (UniqueName: \"kubernetes.io/projected/db73c2bb-358b-42ee-b00b-b1c586f50162-kube-api-access-gpzns\") pod \"network-check-source-8894fc9bd-jkvx4\" (UID: \"db73c2bb-358b-42ee-b00b-b1c586f50162\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" Apr 22 19:08:42.476827 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.476808 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" Apr 22 19:08:42.547440 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.547013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.547440 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.547101 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9dd00bb-c13a-4382-a816-dc9639b9e184-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.547440 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.547242 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:42.547440 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.547343 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.047318419 +0000 UTC m=+154.095074615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:42.547924 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.547850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b9dd00bb-c13a-4382-a816-dc9639b9e184-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:42.578221 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.578199 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" Apr 22 19:08:42.585386 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.585366 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8"] Apr 22 19:08:42.588421 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:08:42.588394 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94ebe646_1062_42b2_ba8b_a73a0b60e0f6.slice/crio-f77a2bc4eb3a3560b348eb0a2ac82dea56a3eb9dc5426c0e09e94ff50f0a67d7 WatchSource:0}: Error finding container f77a2bc4eb3a3560b348eb0a2ac82dea56a3eb9dc5426c0e09e94ff50f0a67d7: Status 404 returned error can't find the container with id f77a2bc4eb3a3560b348eb0a2ac82dea56a3eb9dc5426c0e09e94ff50f0a67d7 Apr 22 19:08:42.687026 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.686998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4"] Apr 22 19:08:42.689853 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:08:42.689828 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb73c2bb_358b_42ee_b00b_b1c586f50162.slice/crio-5e9e589229fc44e05837c635c97bc5177422fca97af7ca72399454e3baa7b42c WatchSource:0}: Error finding container 5e9e589229fc44e05837c635c97bc5177422fca97af7ca72399454e3baa7b42c: Status 404 returned error can't find the container with id 5e9e589229fc44e05837c635c97bc5177422fca97af7ca72399454e3baa7b42c Apr 22 19:08:42.836404 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.836371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" event={"ID":"db73c2bb-358b-42ee-b00b-b1c586f50162","Type":"ContainerStarted","Data":"ecd487d972a1b0e5582367a9348afa088493a4b0fa2e0ede22c7f3f2c87a1028"} Apr 22 19:08:42.836566 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.836412 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" event={"ID":"db73c2bb-358b-42ee-b00b-b1c586f50162","Type":"ContainerStarted","Data":"5e9e589229fc44e05837c635c97bc5177422fca97af7ca72399454e3baa7b42c"} Apr 22 19:08:42.837373 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.837348 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" event={"ID":"94ebe646-1062-42b2-ba8b-a73a0b60e0f6","Type":"ContainerStarted","Data":"f77a2bc4eb3a3560b348eb0a2ac82dea56a3eb9dc5426c0e09e94ff50f0a67d7"} Apr 22 19:08:42.849284 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.849264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:42.849431 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849414 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:42.849488 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.849450 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.849488 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.84945686 +0000 UTC m=+154.897213049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:42.849634 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.849538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:42.849634 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849557 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:42.849634 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.849586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:42.849634 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849606 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.84959053 +0000 UTC m=+154.897346738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:42.849781 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849645 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:08:42.849781 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849661 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.849650059 +0000 UTC m=+154.897406247 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:42.849781 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:42.849687 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls podName:1d66a3f8-13ef-4828-8abb-f0982f4e7a97 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.849672094 +0000 UTC m=+154.897428278 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rjggh" (UID: "1d66a3f8-13ef-4828-8abb-f0982f4e7a97") : secret "samples-operator-tls" not found Apr 22 19:08:42.854852 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:42.854813 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jkvx4" podStartSLOduration=0.854798779 podStartE2EDuration="854.798779ms" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:42.853660205 +0000 UTC m=+153.901416411" watchObservedRunningTime="2026-04-22 19:08:42.854798779 +0000 UTC m=+153.902554988" Apr 22 19:08:43.050792 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:43.050763 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:43.050917 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.050887 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:43.050970 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.050947 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:44.050924399 +0000 UTC m=+155.098680583 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:43.857673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:43.857643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:43.857697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:43.857732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:43.857813 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857816 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857887 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857888 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls podName:1d66a3f8-13ef-4828-8abb-f0982f4e7a97 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:45.857868183 +0000 UTC m=+156.905624368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rjggh" (UID: "1d66a3f8-13ef-4828-8abb-f0982f4e7a97") : secret "samples-operator-tls" not found Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857928 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:08:45.857917228 +0000 UTC m=+156.905673412 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857938 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:45.857932397 +0000 UTC m=+156.905688582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.857949 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:43.858122 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:43.858011 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:45.857995396 +0000 UTC m=+156.905751582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:44.058684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:44.058649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:44.058821 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:44.058788 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:44.058870 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:44.058844 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:46.058829244 +0000 UTC m=+157.106585428 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:44.843168 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:44.843133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" event={"ID":"94ebe646-1062-42b2-ba8b-a73a0b60e0f6","Type":"ContainerStarted","Data":"04f005cdbdc9b4f10d328d471cfb973014d4594979eb240cef51262a4e7ef1a0"} Apr 22 19:08:44.859859 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:44.859818 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ldws8" podStartSLOduration=1.558722095 podStartE2EDuration="2.859805351s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:42.590071901 +0000 UTC m=+153.637828085" lastFinishedPulling="2026-04-22 19:08:43.89115514 +0000 UTC m=+154.938911341" observedRunningTime="2026-04-22 19:08:44.85934061 +0000 UTC m=+155.907096815" watchObservedRunningTime="2026-04-22 19:08:44.859805351 +0000 UTC m=+155.907561556" Apr 22 19:08:45.434852 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.434801 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" podUID="e92d5fad-0e3c-4252-8c7e-880f065232c1" Apr 22 19:08:45.469307 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.469278 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hvrqj" podUID="627cf532-d693-4215-85b9-807d744857ce" Apr 22 19:08:45.481418 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.481386 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s5lsg" podUID="20d45b6d-4197-46bf-bb48-01fcb92e75d7" Apr 22 19:08:45.491805 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.491788 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4mp4v" podUID="f941c8a3-428c-47d0-a796-fd116d4256dc" Apr 22 19:08:45.844933 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.844907 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:08:45.845079 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.844907 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:08:45.845079 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.844917 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-s5lsg" Apr 22 19:08:45.869633 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.869611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.869644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.869682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:45.869729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869741 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869746 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869792 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:49.869777753 +0000 UTC m=+160.917533938 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869844 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:49.869826175 +0000 UTC m=+160.917582367 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869844 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869862 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls podName:1d66a3f8-13ef-4828-8abb-f0982f4e7a97 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:49.869855775 +0000 UTC m=+160.917611959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rjggh" (UID: "1d66a3f8-13ef-4828-8abb-f0982f4e7a97") : secret "samples-operator-tls" not found Apr 22 19:08:45.869971 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:45.869913 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:49.86989849 +0000 UTC m=+160.917654691 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:46.072473 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:46.072439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:46.072620 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:46.072565 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:46.072663 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:46.072631 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:50.072616157 +0000 UTC m=+161.120372341 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:47.804832 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.804803 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr"] Apr 22 19:08:47.807001 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.806986 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" Apr 22 19:08:47.811018 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.810999 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:47.811137 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.810999 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mbtg9\"" Apr 22 19:08:47.811137 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.811053 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 19:08:47.817046 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.817026 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr"] Apr 22 19:08:47.886377 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.886351 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5jp\" (UniqueName: \"kubernetes.io/projected/17159697-aff9-46c8-9fbc-9155d0485df2-kube-api-access-xk5jp\") pod \"migrator-74bb7799d9-sf8zr\" (UID: \"17159697-aff9-46c8-9fbc-9155d0485df2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" Apr 22 19:08:47.987005 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.986973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5jp\" (UniqueName: \"kubernetes.io/projected/17159697-aff9-46c8-9fbc-9155d0485df2-kube-api-access-xk5jp\") pod \"migrator-74bb7799d9-sf8zr\" (UID: \"17159697-aff9-46c8-9fbc-9155d0485df2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" Apr 22 19:08:47.995757 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:47.995737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5jp\" (UniqueName: \"kubernetes.io/projected/17159697-aff9-46c8-9fbc-9155d0485df2-kube-api-access-xk5jp\") pod \"migrator-74bb7799d9-sf8zr\" (UID: \"17159697-aff9-46c8-9fbc-9155d0485df2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" Apr 22 19:08:48.115548 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:48.115491 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" Apr 22 19:08:48.225441 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:48.225409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr"] Apr 22 19:08:48.228039 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:08:48.228008 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17159697_aff9_46c8_9fbc_9155d0485df2.slice/crio-36d9a420ba3d21a5395bd902529887915b1debd699ff9ecac34b03ed29e361b5 WatchSource:0}: Error finding container 36d9a420ba3d21a5395bd902529887915b1debd699ff9ecac34b03ed29e361b5: Status 404 returned error can't find the container with id 36d9a420ba3d21a5395bd902529887915b1debd699ff9ecac34b03ed29e361b5 Apr 22 19:08:48.852761 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:48.852730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" event={"ID":"17159697-aff9-46c8-9fbc-9155d0485df2","Type":"ContainerStarted","Data":"36d9a420ba3d21a5395bd902529887915b1debd699ff9ecac34b03ed29e361b5"} Apr 22 19:08:49.333080 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.333050 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jhwl6_c470388d-c98e-482f-9c89-a240f3abac2d/dns-node-resolver/0.log" Apr 22 19:08:49.855995 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.855964 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" event={"ID":"17159697-aff9-46c8-9fbc-9155d0485df2","Type":"ContainerStarted","Data":"c7078fa516916a4bb8ec4b8e9dbce4b06646ac739e563aab712aab266f7f1d69"} Apr 22 19:08:49.855995 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.855995 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" event={"ID":"17159697-aff9-46c8-9fbc-9155d0485df2","Type":"ContainerStarted","Data":"b069a7cd6c95c2346fd8f41c870db6220726887826c5803c9f10d5e597599434"} Apr 22 19:08:49.872124 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.872077 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-sf8zr" podStartSLOduration=2.076594605 podStartE2EDuration="2.872065151s" podCreationTimestamp="2026-04-22 19:08:47 +0000 UTC" firstStartedPulling="2026-04-22 19:08:48.229836967 +0000 UTC m=+159.277593152" lastFinishedPulling="2026-04-22 19:08:49.025307509 +0000 UTC m=+160.073063698" observedRunningTime="2026-04-22 19:08:49.871264414 +0000 UTC m=+160.919020633" watchObservedRunningTime="2026-04-22 19:08:49.872065151 +0000 UTC m=+160.919821387" Apr 22 19:08:49.900015 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.899990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:49.900113 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.900025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:49.900173 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900129 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:08:49.900173 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.900151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:49.900267 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900184 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls podName:1d66a3f8-13ef-4828-8abb-f0982f4e7a97 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:57.90016701 +0000 UTC m=+168.947923209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-rjggh" (UID: "1d66a3f8-13ef-4828-8abb-f0982f4e7a97") : secret "samples-operator-tls" not found Apr 22 19:08:49.900267 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900128 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:49.900267 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900225 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:57.900211034 +0000 UTC m=+168.947967235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:49.900267 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:49.900245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:49.900267 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900256 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:57.900241484 +0000 UTC m=+168.947997672 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:49.900454 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900297 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:49.900454 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:49.900323 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:57.900314137 +0000 UTC m=+168.948070321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:50.101005 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.100983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:50.101090 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.101057 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:50.101129 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.101093 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:58.101083459 +0000 UTC m=+169.148839644 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:50.302802 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.302772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") pod \"image-registry-5b5687b7fb-dblvq\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:08:50.302936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.302822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:08:50.302936 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.302925 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:08:50.303020 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.302938 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b5687b7fb-dblvq: secret "image-registry-tls" not found Apr 22 19:08:50.303020 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.302973 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:08:50.303020 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.302995 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls podName:e92d5fad-0e3c-4252-8c7e-880f065232c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:10:52.302981233 +0000 UTC m=+283.350737423 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls") pod "image-registry-5b5687b7fb-dblvq" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1") : secret "image-registry-tls" not found Apr 22 19:08:50.303020 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.303016 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls podName:20d45b6d-4197-46bf-bb48-01fcb92e75d7 nodeName:}" failed. No retries permitted until 2026-04-22 19:10:52.303005404 +0000 UTC m=+283.350761588 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls") pod "dns-default-s5lsg" (UID: "20d45b6d-4197-46bf-bb48-01fcb92e75d7") : secret "dns-default-metrics-tls" not found Apr 22 19:08:50.334001 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.333981 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tbt7s_6756b0b3-8e30-47c8-925b-478ee2126fcc/node-ca/0.log" Apr 22 19:08:50.395535 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.395501 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xkv6f"] Apr 22 19:08:50.397464 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.397451 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.400163 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.400145 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 19:08:50.400247 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.400202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 19:08:50.400310 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.400263 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 19:08:50.400310 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.400263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tmm4r\"" Apr 22 19:08:50.400496 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.400480 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 19:08:50.403270 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.403249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:08:50.403367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.403356 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:08:50.403431 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:50.403403 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert podName:f941c8a3-428c-47d0-a796-fd116d4256dc nodeName:}" failed. No retries permitted until 2026-04-22 19:10:52.403384222 +0000 UTC m=+283.451140407 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert") pod "ingress-canary-4mp4v" (UID: "f941c8a3-428c-47d0-a796-fd116d4256dc") : secret "canary-serving-cert" not found Apr 22 19:08:50.410585 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.410564 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xkv6f"] Apr 22 19:08:50.504363 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.504339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwj79\" (UniqueName: \"kubernetes.io/projected/fe3b8e27-7ac1-404c-8330-12fb1819a90e-kube-api-access-lwj79\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.504452 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.504375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-key\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.504500 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.504486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-cabundle\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.605638 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.605585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-cabundle\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.605638 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.605633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwj79\" (UniqueName: \"kubernetes.io/projected/fe3b8e27-7ac1-404c-8330-12fb1819a90e-kube-api-access-lwj79\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.605755 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.605739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-key\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.606142 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.606122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-cabundle\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.608009 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.607989 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe3b8e27-7ac1-404c-8330-12fb1819a90e-signing-key\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.614305 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.614283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwj79\" (UniqueName: \"kubernetes.io/projected/fe3b8e27-7ac1-404c-8330-12fb1819a90e-kube-api-access-lwj79\") pod \"service-ca-865cb79987-xkv6f\" (UID: \"fe3b8e27-7ac1-404c-8330-12fb1819a90e\") " pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.705905 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.705878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xkv6f" Apr 22 19:08:50.816206 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.816181 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xkv6f"] Apr 22 19:08:50.819819 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:08:50.819794 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe3b8e27_7ac1_404c_8330_12fb1819a90e.slice/crio-dfce0a8f4ddb9299e52785784cc7aaad1887984306061a7c5186e4c34fd7f77d WatchSource:0}: Error finding container dfce0a8f4ddb9299e52785784cc7aaad1887984306061a7c5186e4c34fd7f77d: Status 404 returned error can't find the container with id dfce0a8f4ddb9299e52785784cc7aaad1887984306061a7c5186e4c34fd7f77d Apr 22 19:08:50.859201 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:50.859139 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xkv6f" event={"ID":"fe3b8e27-7ac1-404c-8330-12fb1819a90e","Type":"ContainerStarted","Data":"dfce0a8f4ddb9299e52785784cc7aaad1887984306061a7c5186e4c34fd7f77d"} Apr 22 19:08:52.866205 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:52.866134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xkv6f" event={"ID":"fe3b8e27-7ac1-404c-8330-12fb1819a90e","Type":"ContainerStarted","Data":"bba45a79e95f5b48b03faeef8739009d50cb488520fad8be17097e02d7eb5e11"} Apr 22 19:08:52.882477 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:52.882434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xkv6f" podStartSLOduration=1.213156951 podStartE2EDuration="2.88242229s" podCreationTimestamp="2026-04-22 19:08:50 +0000 UTC" firstStartedPulling="2026-04-22 19:08:50.821519943 +0000 UTC m=+161.869276143" lastFinishedPulling="2026-04-22 19:08:52.490785293 +0000 UTC m=+163.538541482" observedRunningTime="2026-04-22 19:08:52.882108213 +0000 UTC m=+163.929864419" watchObservedRunningTime="2026-04-22 19:08:52.88242229 +0000 UTC m=+163.930178497" Apr 22 19:08:57.963212 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:57.963136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:57.963212 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:57.963180 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:57.963271 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:57.963298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:57.963320 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls podName:9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:13.963306638 +0000 UTC m=+185.011062822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-s8877" (UID: "9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:57.963361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:57.963370 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:57.963408 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:13.96339529 +0000 UTC m=+185.011151482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : configmap references non-existent config key: service-ca.crt Apr 22 19:08:57.963667 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:57.963438 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs podName:763efc39-e846-49c8-8d1f-df055b7efff8 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:13.963426878 +0000 UTC m=+185.011183067 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs") pod "router-default-fc6bc5684-m58jn" (UID: "763efc39-e846-49c8-8d1f-df055b7efff8") : secret "router-metrics-certs-default" not found Apr 22 19:08:57.965546 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:57.965530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1d66a3f8-13ef-4828-8abb-f0982f4e7a97-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-rjggh\" (UID: \"1d66a3f8-13ef-4828-8abb-f0982f4e7a97\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:58.083464 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:58.083442 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" Apr 22 19:08:58.165162 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:58.165130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:08:58.165288 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:58.165266 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:08:58.165350 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:08:58.165345 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert podName:b9dd00bb-c13a-4382-a816-dc9639b9e184 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:14.165323357 +0000 UTC m=+185.213079555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sx5qr" (UID: "b9dd00bb-c13a-4382-a816-dc9639b9e184") : secret "networking-console-plugin-cert" not found Apr 22 19:08:58.197153 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:58.197133 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh"] Apr 22 19:08:58.882571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:58.882531 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" event={"ID":"1d66a3f8-13ef-4828-8abb-f0982f4e7a97","Type":"ContainerStarted","Data":"67b72c90a3d05e10006e65c9135028411895775caeb20fdd850f3ff86261dc7f"} Apr 22 19:08:59.457456 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:08:59.457415 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:09:00.888929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:00.888851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" event={"ID":"1d66a3f8-13ef-4828-8abb-f0982f4e7a97","Type":"ContainerStarted","Data":"17eed5d0b7fdc3afacdea7d997d5dde4f65b5c967ac69b73f75701a00dfbd62a"} Apr 22 19:09:00.888929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:00.888885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" event={"ID":"1d66a3f8-13ef-4828-8abb-f0982f4e7a97","Type":"ContainerStarted","Data":"2e850b36920e10430507be7de53be32a654f5d59b3e63e10ec0d5fb8e2319072"} Apr 22 19:09:00.915537 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:00.915480 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-rjggh" podStartSLOduration=16.651472285 podStartE2EDuration="18.915467477s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:58.235142276 +0000 UTC m=+169.282898459" lastFinishedPulling="2026-04-22 19:09:00.499137467 +0000 UTC m=+171.546893651" observedRunningTime="2026-04-22 19:09:00.914422909 +0000 UTC m=+171.962179126" watchObservedRunningTime="2026-04-22 19:09:00.915467477 +0000 UTC m=+171.963223682" Apr 22 19:09:09.099466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.099420 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rmvq2"] Apr 22 19:09:09.105762 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.105738 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.109664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.109639 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:09:09.109664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.109676 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:09:09.109861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.109678 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:09:09.109861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.109727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vg9n6\"" Apr 22 19:09:09.109861 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.109824 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:09:09.120409 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.120390 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rmvq2"] Apr 22 19:09:09.147282 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.147262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-data-volume\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.147380 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.147302 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-crio-socket\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.147380 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.147349 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.147457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.147401 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n7ph\" (UniqueName: \"kubernetes.io/projected/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-api-access-8n7ph\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.147457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.147444 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rmvq2\" (UID: 
\"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.248748 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.248726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.248877 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.248796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n7ph\" (UniqueName: \"kubernetes.io/projected/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-api-access-8n7ph\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.248877 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.248864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.248997 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.248902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-data-volume\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.248997 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.248939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-crio-socket\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.249081 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.249070 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-crio-socket\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.249207 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.249191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-data-volume\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.249322 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.249306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.251045 ip-10-0-129-110 
kubenswrapper[2572]: I0422 19:09:09.251028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.259466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.259440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n7ph\" (UniqueName: \"kubernetes.io/projected/44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5-kube-api-access-8n7ph\") pod \"insights-runtime-extractor-rmvq2\" (UID: \"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5\") " pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.416642 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.416585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vg9n6\"" Apr 22 19:09:09.424905 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.424882 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rmvq2" Apr 22 19:09:09.546089 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.546067 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rmvq2"] Apr 22 19:09:09.548361 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:09.548332 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44f2c1d7_31d3_4edc_83b7_d8b759b0e4d5.slice/crio-c719fb27ec23aa3f7d8b8c69d63b8491a23790369c157f875718b056d73a25d1 WatchSource:0}: Error finding container c719fb27ec23aa3f7d8b8c69d63b8491a23790369c157f875718b056d73a25d1: Status 404 returned error can't find the container with id c719fb27ec23aa3f7d8b8c69d63b8491a23790369c157f875718b056d73a25d1 Apr 22 19:09:09.912388 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.912356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rmvq2" event={"ID":"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5","Type":"ContainerStarted","Data":"a790083c38d0db7392a6cc09429513fa3deba823722b1435aa571fc979452656"} Apr 22 19:09:09.912388 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:09.912391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rmvq2" event={"ID":"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5","Type":"ContainerStarted","Data":"c719fb27ec23aa3f7d8b8c69d63b8491a23790369c157f875718b056d73a25d1"} Apr 22 19:09:10.916454 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:10.916420 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rmvq2" event={"ID":"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5","Type":"ContainerStarted","Data":"047c078473723a9ac1251d393ce7fc942a32a6c02244e11d560d079ead95bf97"} Apr 22 19:09:12.922289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:12.922255 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rmvq2" event={"ID":"44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5","Type":"ContainerStarted","Data":"da25d3bb12583c93175aff885034809d5c0dc92f365a596eb08d8ed4d85100fe"} Apr 22 19:09:12.942370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:12.942322 2572 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rmvq2" podStartSLOduration=1.675669554 podStartE2EDuration="3.942309693s" podCreationTimestamp="2026-04-22 19:09:09 +0000 UTC" firstStartedPulling="2026-04-22 19:09:09.605260458 +0000 UTC m=+180.653016642" lastFinishedPulling="2026-04-22 19:09:11.871900594 +0000 UTC m=+182.919656781" observedRunningTime="2026-04-22 19:09:12.940996254 +0000 UTC m=+183.988752460" watchObservedRunningTime="2026-04-22 19:09:12.942309693 +0000 UTC m=+183.990065926" Apr 22 19:09:13.988275 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.988242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:13.988714 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.988285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:13.988714 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.988344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:09:13.988936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.988911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/763efc39-e846-49c8-8d1f-df055b7efff8-service-ca-bundle\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:13.990815 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.990793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-s8877\" (UID: \"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:09:13.990912 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.990874 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/763efc39-e846-49c8-8d1f-df055b7efff8-metrics-certs\") pod \"router-default-fc6bc5684-m58jn\" (UID: \"763efc39-e846-49c8-8d1f-df055b7efff8\") " pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:13.999713 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:13.999695 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wkk4m\"" Apr 22 19:09:14.007816 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.007803 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:14.141208 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.141081 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-fc6bc5684-m58jn"] Apr 22 19:09:14.143558 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:14.143503 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763efc39_e846_49c8_8d1f_df055b7efff8.slice/crio-9385b92be3d35a7d11c1ff2b10bbc9a25d7536bfbaf7994fdc7c7a1295f739a2 WatchSource:0}: Error finding container 9385b92be3d35a7d11c1ff2b10bbc9a25d7536bfbaf7994fdc7c7a1295f739a2: Status 404 returned error can't find the container with id 9385b92be3d35a7d11c1ff2b10bbc9a25d7536bfbaf7994fdc7c7a1295f739a2 Apr 22 19:09:14.189643 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.189621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:09:14.191613 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.191588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b9dd00bb-c13a-4382-a816-dc9639b9e184-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sx5qr\" (UID: \"b9dd00bb-c13a-4382-a816-dc9639b9e184\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:09:14.292269 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.292201 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2xntp\"" Apr 22 19:09:14.301097 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.301078 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" Apr 22 19:09:14.412257 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.412153 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877"] Apr 22 19:09:14.414647 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:14.414622 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d9eb13c_7ebf_45ab_8aa0_ee5b0aaf46d4.slice/crio-ac4ee3efab189cf1550e37eed2a75b81b50ba522e6b4982428a21ca0c53225e2 WatchSource:0}: Error finding container ac4ee3efab189cf1550e37eed2a75b81b50ba522e6b4982428a21ca0c53225e2: Status 404 returned error can't find the container with id ac4ee3efab189cf1550e37eed2a75b81b50ba522e6b4982428a21ca0c53225e2 Apr 22 19:09:14.482974 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.482953 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rz64w\"" Apr 22 19:09:14.490383 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.490369 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" Apr 22 19:09:14.611135 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.611110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr"] Apr 22 19:09:14.613728 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:14.613705 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9dd00bb_c13a_4382_a816_dc9639b9e184.slice/crio-1bd44b6d923646b577eceff8e013a2833e282ce2f95759595fba0723fa4fdc25 WatchSource:0}: Error finding container 1bd44b6d923646b577eceff8e013a2833e282ce2f95759595fba0723fa4fdc25: Status 404 returned error can't find the container with id 1bd44b6d923646b577eceff8e013a2833e282ce2f95759595fba0723fa4fdc25 Apr 22 19:09:14.927731 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.927646 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" event={"ID":"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4","Type":"ContainerStarted","Data":"ac4ee3efab189cf1550e37eed2a75b81b50ba522e6b4982428a21ca0c53225e2"} Apr 22 19:09:14.928563 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.928539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" event={"ID":"b9dd00bb-c13a-4382-a816-dc9639b9e184","Type":"ContainerStarted","Data":"1bd44b6d923646b577eceff8e013a2833e282ce2f95759595fba0723fa4fdc25"} Apr 22 19:09:14.929570 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.929552 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-fc6bc5684-m58jn" event={"ID":"763efc39-e846-49c8-8d1f-df055b7efff8","Type":"ContainerStarted","Data":"7d7b5e0b1288f7f632a4d32c80a490ffc8cf7e7b620ced05fee06166dce5b5a2"} Apr 22 19:09:14.929656 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.929573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-fc6bc5684-m58jn" event={"ID":"763efc39-e846-49c8-8d1f-df055b7efff8","Type":"ContainerStarted","Data":"9385b92be3d35a7d11c1ff2b10bbc9a25d7536bfbaf7994fdc7c7a1295f739a2"} Apr 22 19:09:14.951286 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:14.951250 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-fc6bc5684-m58jn" podStartSLOduration=32.951240023 podStartE2EDuration="32.951240023s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:09:14.950984482 +0000 UTC m=+185.998740689" watchObservedRunningTime="2026-04-22 19:09:14.951240023 +0000 UTC m=+185.998996229" Apr 22 19:09:15.008342 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.008322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:15.010619 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.010602 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:15.933950 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.933910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" 
event={"ID":"b9dd00bb-c13a-4382-a816-dc9639b9e184","Type":"ContainerStarted","Data":"338529f377d93bc6d54759ae3eabb11a5c0197db776bb1bef9ad65f6eca98bd7"} Apr 22 19:09:15.934465 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.934445 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:15.935645 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.935620 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-fc6bc5684-m58jn" Apr 22 19:09:15.952208 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:15.952164 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sx5qr" podStartSLOduration=32.798831053 podStartE2EDuration="33.952152101s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:09:14.615427059 +0000 UTC m=+185.663183243" lastFinishedPulling="2026-04-22 19:09:15.768748089 +0000 UTC m=+186.816504291" observedRunningTime="2026-04-22 19:09:15.951283578 +0000 UTC m=+186.999039796" watchObservedRunningTime="2026-04-22 19:09:15.952152101 +0000 UTC m=+186.999908306" Apr 22 19:09:16.937386 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:16.937304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" event={"ID":"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4","Type":"ContainerStarted","Data":"6d3e8ae5132908bd0304ec7d2fba432b58007ead8bb921bbb7c77d8329d7de0e"} Apr 22 19:09:16.960796 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:16.960750 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" podStartSLOduration=32.886118249 podStartE2EDuration="34.96073706s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:09:14.41648643 +0000 UTC m=+185.464242613" lastFinishedPulling="2026-04-22 19:09:16.49110524 +0000 UTC m=+187.538861424" observedRunningTime="2026-04-22 19:09:16.96070765 +0000 UTC m=+188.008463869" watchObservedRunningTime="2026-04-22 19:09:16.96073706 +0000 UTC m=+188.008493260" Apr 22 19:09:24.957791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:24.957761 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:09:24.958146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:24.957802 2572 generic.go:358] "Generic (PLEG): container finished" podID="9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4" containerID="6d3e8ae5132908bd0304ec7d2fba432b58007ead8bb921bbb7c77d8329d7de0e" exitCode=2 Apr 22 19:09:24.958146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:24.957834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" event={"ID":"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4","Type":"ContainerDied","Data":"6d3e8ae5132908bd0304ec7d2fba432b58007ead8bb921bbb7c77d8329d7de0e"} Apr 22 19:09:24.958146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:24.958124 2572 scope.go:117] "RemoveContainer" containerID="6d3e8ae5132908bd0304ec7d2fba432b58007ead8bb921bbb7c77d8329d7de0e" Apr 22 19:09:25.962215 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:25.962188 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:09:25.962608 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:25.962280 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-s8877" event={"ID":"9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4","Type":"ContainerStarted","Data":"06b016c3733940256dfb7c6f073cf34967fdc6e674e47df6df18012e40bae6f2"} Apr 22 19:09:28.675327 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.675290 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w9kv6"] Apr 22 19:09:28.678241 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.678219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.679502 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.679469 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9g586"] Apr 22 19:09:28.680774 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.680755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:09:28.681737 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.681714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-xxsb2\"" Apr 22 19:09:28.681829 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.681748 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:09:28.681893 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.681834 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:09:28.681893 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.681844 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:09:28.682639 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.682623 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.684745 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.684727 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:09:28.684745 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.684738 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:09:28.684890 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.684807 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:09:28.684890 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.684813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mncrp\"" Apr 22 19:09:28.693670 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.693649 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w9kv6"] Apr 22 19:09:28.793875 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.793851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794002 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.793889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d77a754-770f-4df8-a80a-4341eb422dea-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.794002 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.793917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794002 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.793972 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.794121 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794121 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794054 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phfjq\" (UniqueName: \"kubernetes.io/projected/1e090539-33ef-4cd3-9340-65a4afb737cd-kube-api-access-phfjq\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794121 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794097 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.794245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-textfile\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.794245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-metrics-client-ca\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bgh8\" (UniqueName: \"kubernetes.io/projected/8d77a754-770f-4df8-a80a-4341eb422dea-kube-api-access-6bgh8\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.794379 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794250 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-root\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794379 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-wtmp\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794379 
ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-sys\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.794379 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.794366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.895565 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895537 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-textfile\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.895684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-metrics-client-ca\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bgh8\" (UniqueName: \"kubernetes.io/projected/8d77a754-770f-4df8-a80a-4341eb422dea-kube-api-access-6bgh8\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.895843 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-root\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895843 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-wtmp\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895943 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-sys\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895996 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.895996 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-wtmp\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.895996 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-textfile\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895905 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-sys\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.895867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1e090539-33ef-4cd3-9340-65a4afb737cd-root\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d77a754-770f-4df8-a80a-4341eb422dea-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896071 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" 
Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phfjq\" (UniqueName: \"kubernetes.io/projected/1e090539-33ef-4cd3-9340-65a4afb737cd-kube-api-access-phfjq\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896544 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-metrics-client-ca\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.896544 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.896660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.896851 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:09:28.896834 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:09:28.897067 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.897067 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.897067 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:09:28.897028 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls podName:1e090539-33ef-4cd3-9340-65a4afb737cd nodeName:}" failed. No retries permitted until 2026-04-22 19:09:29.396974664 +0000 UTC m=+200.444730860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls") pod "node-exporter-9g586" (UID: "1e090539-33ef-4cd3-9340-65a4afb737cd") : secret "node-exporter-tls" not found Apr 22 19:09:28.897067 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.896847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d77a754-770f-4df8-a80a-4341eb422dea-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.898916 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.898887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.899086 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.899064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.899190 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.899171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d77a754-770f-4df8-a80a-4341eb422dea-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.904717 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.904691 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bgh8\" (UniqueName: \"kubernetes.io/projected/8d77a754-770f-4df8-a80a-4341eb422dea-kube-api-access-6bgh8\") pod \"kube-state-metrics-69db897b98-w9kv6\" (UID: \"8d77a754-770f-4df8-a80a-4341eb422dea\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:28.905272 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.905254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phfjq\" (UniqueName: \"kubernetes.io/projected/1e090539-33ef-4cd3-9340-65a4afb737cd-kube-api-access-phfjq\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " 
pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:28.991207 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:28.991178 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" Apr 22 19:09:29.113562 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.113505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-w9kv6"] Apr 22 19:09:29.116659 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:29.116620 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d77a754_770f_4df8_a80a_4341eb422dea.slice/crio-0f24c2959582c9c58964d230028f261831f546e44a55ad43a594ddee893f113a WatchSource:0}: Error finding container 0f24c2959582c9c58964d230028f261831f546e44a55ad43a594ddee893f113a: Status 404 returned error can't find the container with id 0f24c2959582c9c58964d230028f261831f546e44a55ad43a594ddee893f113a Apr 22 19:09:29.401156 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.401078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:29.403333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.403317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1e090539-33ef-4cd3-9340-65a4afb737cd-node-exporter-tls\") pod \"node-exporter-9g586\" (UID: \"1e090539-33ef-4cd3-9340-65a4afb737cd\") " pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:29.596613 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.596578 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9g586" Apr 22 19:09:29.606306 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:29.606275 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e090539_33ef_4cd3_9340_65a4afb737cd.slice/crio-5700e97c226c21199dd8eb0230ec355aaf55deb32f371dd766d2487dc724e351 WatchSource:0}: Error finding container 5700e97c226c21199dd8eb0230ec355aaf55deb32f371dd766d2487dc724e351: Status 404 returned error can't find the container with id 5700e97c226c21199dd8eb0230ec355aaf55deb32f371dd766d2487dc724e351 Apr 22 19:09:29.975814 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.975749 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g586" event={"ID":"1e090539-33ef-4cd3-9340-65a4afb737cd","Type":"ContainerStarted","Data":"5700e97c226c21199dd8eb0230ec355aaf55deb32f371dd766d2487dc724e351"} Apr 22 19:09:29.977592 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:29.977544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" event={"ID":"8d77a754-770f-4df8-a80a-4341eb422dea","Type":"ContainerStarted","Data":"0f24c2959582c9c58964d230028f261831f546e44a55ad43a594ddee893f113a"} Apr 22 19:09:30.982188 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:30.982151 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e090539-33ef-4cd3-9340-65a4afb737cd" containerID="6f7c62345422f2f0f3250fe16c423ed0c2622717f9dd299b9eeab03362a4c83d" exitCode=0 Apr 22 19:09:30.982624 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:30.982242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g586" event={"ID":"1e090539-33ef-4cd3-9340-65a4afb737cd","Type":"ContainerDied","Data":"6f7c62345422f2f0f3250fe16c423ed0c2622717f9dd299b9eeab03362a4c83d"} Apr 22 19:09:30.984336 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:30.984308 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" event={"ID":"8d77a754-770f-4df8-a80a-4341eb422dea","Type":"ContainerStarted","Data":"335e66d02940432a77b012fbd14f617e3563f0ffbe22acad2e68796597a371c4"} Apr 22 19:09:30.984457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:30.984345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" event={"ID":"8d77a754-770f-4df8-a80a-4341eb422dea","Type":"ContainerStarted","Data":"83dc04ebe38e7adf9d4aec89e4adc1a1e977f508dd63aa319c22dae11e95fc42"} Apr 22 19:09:30.984457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:30.984360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" event={"ID":"8d77a754-770f-4df8-a80a-4341eb422dea","Type":"ContainerStarted","Data":"1163cc6420faf9798a6687128ae8cec11e01c558827e7f1c73ea58cc0a946dc4"} Apr 22 19:09:31.021567 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.021504 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-w9kv6" podStartSLOduration=1.6270891779999999 podStartE2EDuration="3.021490779s" podCreationTimestamp="2026-04-22 19:09:28 +0000 UTC" firstStartedPulling="2026-04-22 19:09:29.11840798 +0000 UTC m=+200.166164165" lastFinishedPulling="2026-04-22 19:09:30.512809573 +0000 UTC m=+201.560565766" observedRunningTime="2026-04-22 19:09:31.020177917 +0000 UTC 
m=+202.067934123" watchObservedRunningTime="2026-04-22 19:09:31.021490779 +0000 UTC m=+202.069246985" Apr 22 19:09:31.756125 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.756091 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b5687b7fb-dblvq"] Apr 22 19:09:31.756278 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:09:31.756260 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" podUID="e92d5fad-0e3c-4252-8c7e-880f065232c1" Apr 22 19:09:31.991301 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.991275 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:09:31.991676 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.991302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g586" event={"ID":"1e090539-33ef-4cd3-9340-65a4afb737cd","Type":"ContainerStarted","Data":"21dc09a32a31cb1ddb1186d8699b52c7d8bbcda14614a35feabe930e8637cfd1"} Apr 22 19:09:31.991676 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.991336 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g586" event={"ID":"1e090539-33ef-4cd3-9340-65a4afb737cd","Type":"ContainerStarted","Data":"376a9562f39b1d2b5cf3503b450f63bfdc8791bf125e21f9b28dc74e43499af6"} Apr 22 19:09:31.995596 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:31.995576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:09:32.015582 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.015495 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9g586" podStartSLOduration=3.107029922 podStartE2EDuration="4.015476489s" podCreationTimestamp="2026-04-22 19:09:28 +0000 UTC" firstStartedPulling="2026-04-22 19:09:29.608388308 +0000 UTC m=+200.656144507" lastFinishedPulling="2026-04-22 19:09:30.516834886 +0000 UTC m=+201.564591074" observedRunningTime="2026-04-22 19:09:32.01498834 +0000 UTC m=+203.062744561" watchObservedRunningTime="2026-04-22 19:09:32.015476489 +0000 UTC m=+203.063232696" Apr 22 19:09:32.123812 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.123788 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.123911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.123833 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.123911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.123857 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: 
\"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.123911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.123875 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbblz\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.124025 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124003 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.124075 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124051 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.124128 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124093 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca\") pod \"e92d5fad-0e3c-4252-8c7e-880f065232c1\" (UID: \"e92d5fad-0e3c-4252-8c7e-880f065232c1\") " Apr 22 19:09:32.125034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124339 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:09:32.125034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124356 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:09:32.125034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124586 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:09:32.125034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124605 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-certificates\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.125034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.124627 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e92d5fad-0e3c-4252-8c7e-880f065232c1-ca-trust-extracted\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.126257 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.126234 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:09:32.126370 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.126352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz" (OuterVolumeSpecName: "kube-api-access-bbblz") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "kube-api-access-bbblz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:32.126432 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.126359 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:09:32.126432 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.126391 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e92d5fad-0e3c-4252-8c7e-880f065232c1" (UID: "e92d5fad-0e3c-4252-8c7e-880f065232c1"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:09:32.225612 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.225591 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-installation-pull-secrets\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.225612 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.225614 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbblz\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-kube-api-access-bbblz\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.225740 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.225624 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e92d5fad-0e3c-4252-8c7e-880f065232c1-trusted-ca\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.225740 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.225634 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e92d5fad-0e3c-4252-8c7e-880f065232c1-image-registry-private-configuration\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.225740 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.225644 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-bound-sa-token\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:32.993225 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:32.993193 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b5687b7fb-dblvq" Apr 22 19:09:33.028007 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.027979 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b5687b7fb-dblvq"] Apr 22 19:09:33.034802 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.034773 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5b5687b7fb-dblvq"] Apr 22 19:09:33.082464 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.082439 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5bd878c756-8sl5r"] Apr 22 19:09:33.086909 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.086889 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.089778 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089742 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:09:33.089778 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089752 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:09:33.089935 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:09:33.089935 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089807 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 19:09:33.089935 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-iu0509cjf6db\"" Apr 22 19:09:33.089935 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.089752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zmw4n\"" Apr 22 19:09:33.098536 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.097094 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bd878c756-8sl5r"] Apr 22 19:09:33.132191 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-client-certs\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132191 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjl5x\" (UniqueName: \"kubernetes.io/projected/a2c1bf90-21ac-417c-a652-f49544683d3e-kube-api-access-tjl5x\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132306 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132212 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-client-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132306 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a2c1bf90-21ac-417c-a652-f49544683d3e-audit-log\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132399 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132380 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132448 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-metrics-server-audit-profiles\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132530 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-tls\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.132606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.132592 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e92d5fad-0e3c-4252-8c7e-880f065232c1-registry-tls\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:09:33.233828 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.233794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-metrics-server-audit-profiles\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.233911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.233846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-tls\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.233911 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.233892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-client-certs\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234015 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.233917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjl5x\" (UniqueName: \"kubernetes.io/projected/a2c1bf90-21ac-417c-a652-f49544683d3e-kube-api-access-tjl5x\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234015 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.233944 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-client-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234104 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.234061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a2c1bf90-21ac-417c-a652-f49544683d3e-audit-log\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234185 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.234143 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234430 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.234398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/a2c1bf90-21ac-417c-a652-f49544683d3e-audit-log\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234721 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.234702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.234878 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.234859 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/a2c1bf90-21ac-417c-a652-f49544683d3e-metrics-server-audit-profiles\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.236369 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.236348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-client-ca-bundle\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.236833 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.236813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-client-certs\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.236833 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.236822 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/a2c1bf90-21ac-417c-a652-f49544683d3e-secret-metrics-server-tls\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.242233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.242215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjl5x\" (UniqueName: \"kubernetes.io/projected/a2c1bf90-21ac-417c-a652-f49544683d3e-kube-api-access-tjl5x\") pod \"metrics-server-5bd878c756-8sl5r\" (UID: \"a2c1bf90-21ac-417c-a652-f49544683d3e\") " pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.401186 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.401114 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:33.430064 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.430037 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6"] Apr 22 19:09:33.434969 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.434950 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:33.438939 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.438906 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:09:33.439041 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.438966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-qdw4q\"" Apr 22 19:09:33.450562 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.450449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6"] Apr 22 19:09:33.460826 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.460791 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e92d5fad-0e3c-4252-8c7e-880f065232c1" path="/var/lib/kubelet/pods/e92d5fad-0e3c-4252-8c7e-880f065232c1/volumes" Apr 22 19:09:33.531295 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.531266 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5bd878c756-8sl5r"] Apr 22 19:09:33.534826 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:33.534799 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c1bf90_21ac_417c_a652_f49544683d3e.slice/crio-ef6828898b1ee6bdc8f663bb05a4e0d821f4a879614ae07ec0d40c905e97eb38 WatchSource:0}: Error finding container ef6828898b1ee6bdc8f663bb05a4e0d821f4a879614ae07ec0d40c905e97eb38: Status 404 returned error can't find the container with id ef6828898b1ee6bdc8f663bb05a4e0d821f4a879614ae07ec0d40c905e97eb38 Apr 22 19:09:33.536175 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.536152 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58128424-86cf-418f-b63e-97891f8faf56-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g69c6\" (UID: \"58128424-86cf-418f-b63e-97891f8faf56\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:33.636830 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.636802 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58128424-86cf-418f-b63e-97891f8faf56-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g69c6\" (UID: \"58128424-86cf-418f-b63e-97891f8faf56\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:33.639791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.639752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/58128424-86cf-418f-b63e-97891f8faf56-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-g69c6\" (UID: \"58128424-86cf-418f-b63e-97891f8faf56\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:33.744643 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.744617 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:33.868826 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.868796 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6"] Apr 22 19:09:33.872224 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:33.872192 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58128424_86cf_418f_b63e_97891f8faf56.slice/crio-cfee5f682b5c22216cb17d2a4aad193915fa0809fff36f56746f0cb35e95b560 WatchSource:0}: Error finding container cfee5f682b5c22216cb17d2a4aad193915fa0809fff36f56746f0cb35e95b560: Status 404 returned error can't find the container with id cfee5f682b5c22216cb17d2a4aad193915fa0809fff36f56746f0cb35e95b560 Apr 22 19:09:33.882554 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.882506 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-79794fcb7c-7xbcx"] Apr 22 19:09:33.887207 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.887193 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.890678 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.890654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 19:09:33.891001 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.890985 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 19:09:33.891080 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.891062 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 19:09:33.891225 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.891209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 19:09:33.891274 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.891255 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dpv8j\"" Apr 22 19:09:33.891571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.891554 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 19:09:33.897186 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.897166 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 19:09:33.899695 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.899678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79794fcb7c-7xbcx"] Apr 22 19:09:33.940119 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940100 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9qvw\" (UniqueName: \"kubernetes.io/projected/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-kube-api-access-r9qvw\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940215 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-metrics-client-ca\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940215 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940215 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-federate-client-tls\") pod 
\"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940325 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-client-tls\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940363 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940363 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940354 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-serving-certs-ca-bundle\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.940432 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.940380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:33.996998 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.996940 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" event={"ID":"a2c1bf90-21ac-417c-a652-f49544683d3e","Type":"ContainerStarted","Data":"ef6828898b1ee6bdc8f663bb05a4e0d821f4a879614ae07ec0d40c905e97eb38"} Apr 22 19:09:33.997856 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:33.997835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" event={"ID":"58128424-86cf-418f-b63e-97891f8faf56","Type":"ContainerStarted","Data":"cfee5f682b5c22216cb17d2a4aad193915fa0809fff36f56746f0cb35e95b560"} Apr 22 19:09:34.041584 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041589 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-serving-certs-ca-bundle\") pod 
\"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041750 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9qvw\" (UniqueName: \"kubernetes.io/projected/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-kube-api-access-r9qvw\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041790 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-metrics-client-ca\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041837 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.041837 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.041821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-federate-client-tls\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.042196 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.042066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-client-tls\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.042360 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.042332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-serving-certs-ca-bundle\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.042620 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.042594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-metrics-client-ca\") pod \"telemeter-client-79794fcb7c-7xbcx\" 
(UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.043273 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.043252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.044245 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.044224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.044324 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.044300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-federate-client-tls\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.044371 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.044325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-secret-telemeter-client\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.044936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.044921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-telemeter-client-tls\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.060414 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.060393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9qvw\" (UniqueName: \"kubernetes.io/projected/adea0732-ba1b-4d0d-b6a2-49a6e98f453f-kube-api-access-r9qvw\") pod \"telemeter-client-79794fcb7c-7xbcx\" (UID: \"adea0732-ba1b-4d0d-b6a2-49a6e98f453f\") " pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.197641 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.197620 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" Apr 22 19:09:34.338308 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.338275 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79794fcb7c-7xbcx"] Apr 22 19:09:34.340545 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:34.340499 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadea0732_ba1b_4d0d_b6a2_49a6e98f453f.slice/crio-ceb11bacc3970d4f58bb14264c9142c73951b42be801e46d2050e451c41ba224 WatchSource:0}: Error finding container ceb11bacc3970d4f58bb14264c9142c73951b42be801e46d2050e451c41ba224: Status 404 returned error can't find the container with id ceb11bacc3970d4f58bb14264c9142c73951b42be801e46d2050e451c41ba224 Apr 22 19:09:34.891291 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.891263 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:09:34.895136 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.895114 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.897817 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.897790 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:09:34.897915 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.897832 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:09:34.899041 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.899011 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:09:34.899135 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.899078 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:09:34.899135 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.899013 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:09:34.899135 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.899125 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:09:34.899307 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.899222 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:09:34.900063 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900042 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:09:34.900063 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900061 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:09:34.900232 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900089 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7dkpn\"" Apr 22 19:09:34.900340 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900322 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:09:34.900400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900322 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:09:34.900449 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.900414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-55lqld3e4lnpu\"" Apr 22 19:09:34.904014 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.903576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:09:34.912181 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.912159 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:09:34.950671 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950780 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950780 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950714 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950780 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.950936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.950994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9rk\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951190 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:34.951400 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:34.951340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.001859 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.001826 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" event={"ID":"adea0732-ba1b-4d0d-b6a2-49a6e98f453f","Type":"ContainerStarted","Data":"ceb11bacc3970d4f58bb14264c9142c73951b42be801e46d2050e451c41ba224"} Apr 22 19:09:35.052749 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.052879 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.052879 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.052879 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052961 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.052987 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053198 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053583 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053566 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 
kubenswrapper[2572]: I0422 19:09:35.053742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.053917 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.054251 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9rk\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.054251 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.053983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.054251 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.054012 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.055792 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.055764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.057595 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.056986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.057595 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.057215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.057595 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.057543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.058118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.058039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.058617 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.058537 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.058810 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.058790 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.059374 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.059340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060071 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060280 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060428 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060376 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.060547 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.060454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.065009 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.064985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9rk\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk\") pod \"prometheus-k8s-0\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.207419 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.207339 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:35.780995 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:35.780952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:09:35.784445 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:09:35.784408 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8b49c5_b5d5_4e56_a228_f15b79a38c14.slice/crio-4fdfc03c2e8071ded3b4e3597c8589a7235c23c713429d5fa8cb61a1ed60a227 WatchSource:0}: Error finding container 4fdfc03c2e8071ded3b4e3597c8589a7235c23c713429d5fa8cb61a1ed60a227: Status 404 returned error can't find the container with id 4fdfc03c2e8071ded3b4e3597c8589a7235c23c713429d5fa8cb61a1ed60a227 Apr 22 19:09:36.006584 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.006549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" event={"ID":"58128424-86cf-418f-b63e-97891f8faf56","Type":"ContainerStarted","Data":"ee3b2c2af151714aa8b414f17c05fe96adc2c25ebcbb68b3e3935983aeeea95d"} Apr 22 19:09:36.007006 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.006934 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:36.008142 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.008119 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" event={"ID":"a2c1bf90-21ac-417c-a652-f49544683d3e","Type":"ContainerStarted","Data":"9aba506d99c3e8ff764e1cfa7f0cf3ff2346cb570c3ab8b17e763f61a5774161"} Apr 22 19:09:36.009557 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.009500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"4fdfc03c2e8071ded3b4e3597c8589a7235c23c713429d5fa8cb61a1ed60a227"} Apr 22 19:09:36.012152 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.012126 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" Apr 22 19:09:36.024118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.024066 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-g69c6" podStartSLOduration=1.257841483 podStartE2EDuration="3.024050441s" podCreationTimestamp="2026-04-22 19:09:33 +0000 UTC" firstStartedPulling="2026-04-22 19:09:33.874059239 +0000 UTC m=+204.921815423" lastFinishedPulling="2026-04-22 19:09:35.640268186 +0000 UTC m=+206.688024381" observedRunningTime="2026-04-22 19:09:36.022848591 +0000 UTC m=+207.070604798" watchObservedRunningTime="2026-04-22 19:09:36.024050441 +0000 UTC m=+207.071806651" Apr 22 19:09:36.043065 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:36.043019 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" podStartSLOduration=0.938298036 podStartE2EDuration="3.04300837s" podCreationTimestamp="2026-04-22 19:09:33 +0000 UTC" firstStartedPulling="2026-04-22 19:09:33.537007165 +0000 UTC m=+204.584763349" lastFinishedPulling="2026-04-22 19:09:35.641717482 +0000 UTC m=+206.689473683" observedRunningTime="2026-04-22 19:09:36.042437652 +0000 UTC m=+207.090193871" watchObservedRunningTime="2026-04-22 
19:09:36.04300837 +0000 UTC m=+207.090764576" Apr 22 19:09:37.014290 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:37.014198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" event={"ID":"adea0732-ba1b-4d0d-b6a2-49a6e98f453f","Type":"ContainerStarted","Data":"7e00caf1ebde9784b4900ac042eaee93d8cf012d7c3fedcc14b838266bbc66dd"} Apr 22 19:09:38.018555 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:38.018504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" event={"ID":"adea0732-ba1b-4d0d-b6a2-49a6e98f453f","Type":"ContainerStarted","Data":"54cef565963fc171f0cc573696d5a7a614e66508ba382ed6cefebae634fdc762"} Apr 22 19:09:38.018960 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:38.018559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" event={"ID":"adea0732-ba1b-4d0d-b6a2-49a6e98f453f","Type":"ContainerStarted","Data":"5084603f014e5638b05478cd9b741799d091ec46303be537a513072eb6403a7f"} Apr 22 19:09:38.019793 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:38.019768 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" exitCode=0 Apr 22 19:09:38.019881 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:38.019842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} Apr 22 19:09:38.043941 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:38.043900 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-79794fcb7c-7xbcx" podStartSLOduration=2.254988099 podStartE2EDuration="5.04388809s" podCreationTimestamp="2026-04-22 19:09:33 +0000 UTC" firstStartedPulling="2026-04-22 19:09:34.342369912 +0000 UTC m=+205.390126099" lastFinishedPulling="2026-04-22 19:09:37.131269902 +0000 UTC m=+208.179026090" observedRunningTime="2026-04-22 19:09:38.041306427 +0000 UTC m=+209.089062661" watchObservedRunningTime="2026-04-22 19:09:38.04388809 +0000 UTC m=+209.091644296" Apr 22 19:09:42.034648 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:42.034616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} Apr 22 19:09:42.034648 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:42.034651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} Apr 22 19:09:44.043087 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:44.043017 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} Apr 22 19:09:44.043087 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:44.043053 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} Apr 22 19:09:44.043087 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:44.043062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} Apr 22 19:09:44.043087 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:44.043070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerStarted","Data":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} Apr 22 19:09:44.073676 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:44.073636 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.095127893 podStartE2EDuration="10.073623062s" podCreationTimestamp="2026-04-22 19:09:34 +0000 UTC" firstStartedPulling="2026-04-22 19:09:35.786823574 +0000 UTC m=+206.834579761" lastFinishedPulling="2026-04-22 19:09:43.765318743 +0000 UTC m=+214.813074930" observedRunningTime="2026-04-22 19:09:44.07115395 +0000 UTC m=+215.118910156" watchObservedRunningTime="2026-04-22 19:09:44.073623062 +0000 UTC m=+215.121379267" Apr 22 19:09:45.208187 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:45.208140 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:09:53.401578 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:53.401535 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:09:53.401578 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:09:53.401582 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:10:13.406963 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:13.406933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:10:13.410707 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:13.410682 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5bd878c756-8sl5r" Apr 22 19:10:20.222569 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:20.222534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:10:20.224787 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:20.224767 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/627cf532-d693-4215-85b9-807d744857ce-metrics-certs\") pod \"network-metrics-daemon-hvrqj\" (UID: \"627cf532-d693-4215-85b9-807d744857ce\") " pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:10:20.460718 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:20.460694 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\"" Apr 22 19:10:20.468918 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:20.468900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hvrqj" Apr 22 19:10:20.584303 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:20.584274 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hvrqj"] Apr 22 19:10:20.587269 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:10:20.587243 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod627cf532_d693_4215_85b9_807d744857ce.slice/crio-5aeb4b5a6d2e8177b4f4741336a73dcde9bbcf4004cec677ea321748a19b299f WatchSource:0}: Error finding container 5aeb4b5a6d2e8177b4f4741336a73dcde9bbcf4004cec677ea321748a19b299f: Status 404 returned error can't find the container with id 5aeb4b5a6d2e8177b4f4741336a73dcde9bbcf4004cec677ea321748a19b299f Apr 22 19:10:21.142172 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:21.142127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hvrqj" event={"ID":"627cf532-d693-4215-85b9-807d744857ce","Type":"ContainerStarted","Data":"5aeb4b5a6d2e8177b4f4741336a73dcde9bbcf4004cec677ea321748a19b299f"} Apr 22 19:10:22.146380 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:22.146342 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hvrqj" event={"ID":"627cf532-d693-4215-85b9-807d744857ce","Type":"ContainerStarted","Data":"df892fe704f2edccc7d10d77b0c0839aafb784971d1fc203c3e10d80ac62550f"} Apr 22 19:10:22.146380 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:22.146378 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hvrqj" event={"ID":"627cf532-d693-4215-85b9-807d744857ce","Type":"ContainerStarted","Data":"f423bcf3fa480293f3f32c6ff6d072a37ecba6c2da853e4de1a59b59a8a314f6"} Apr 22 19:10:22.186524 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:22.186464 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hvrqj" podStartSLOduration=252.284018981 podStartE2EDuration="4m13.186449451s" podCreationTimestamp="2026-04-22 19:06:09 +0000 UTC" firstStartedPulling="2026-04-22 19:10:20.588979746 +0000 UTC m=+251.636735929" lastFinishedPulling="2026-04-22 19:10:21.491410215 +0000 UTC m=+252.539166399" observedRunningTime="2026-04-22 19:10:22.185298265 +0000 UTC m=+253.233054470" watchObservedRunningTime="2026-04-22 19:10:22.186449451 +0000 UTC m=+253.234205657" Apr 22 19:10:35.208528 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:35.208432 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:35.223758 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:35.223732 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:36.201519 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:36.201481 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:48.846231 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:48.846180 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-ingress-canary/ingress-canary-4mp4v" podUID="f941c8a3-428c-47d0-a796-fd116d4256dc" Apr 22 19:10:48.846231 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:48.846198 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-s5lsg" podUID="20d45b6d-4197-46bf-bb48-01fcb92e75d7" Apr 22 19:10:49.225294 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:49.225266 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:10:49.225442 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:49.225266 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5lsg" Apr 22 19:10:52.379260 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.379228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:10:52.381433 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.381412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/20d45b6d-4197-46bf-bb48-01fcb92e75d7-metrics-tls\") pod \"dns-default-s5lsg\" (UID: \"20d45b6d-4197-46bf-bb48-01fcb92e75d7\") " pod="openshift-dns/dns-default-s5lsg" Apr 22 19:10:52.479801 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.479768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:10:52.482020 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.481990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f941c8a3-428c-47d0-a796-fd116d4256dc-cert\") pod \"ingress-canary-4mp4v\" (UID: \"f941c8a3-428c-47d0-a796-fd116d4256dc\") " pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:10:52.528421 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.528398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\"" Apr 22 19:10:52.529020 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.529006 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\"" Apr 22 19:10:52.536939 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.536924 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5lsg" Apr 22 19:10:52.537078 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.537062 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4mp4v" Apr 22 19:10:52.664466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.664441 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5lsg"] Apr 22 19:10:52.667090 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:10:52.667055 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d45b6d_4197_46bf_bb48_01fcb92e75d7.slice/crio-39d0801d45179bc8e4d90955e70741702c93b84b2957b1afe6038806469ccbbe WatchSource:0}: Error finding container 39d0801d45179bc8e4d90955e70741702c93b84b2957b1afe6038806469ccbbe: Status 404 returned error can't find the container with id 39d0801d45179bc8e4d90955e70741702c93b84b2957b1afe6038806469ccbbe Apr 22 19:10:52.679863 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:52.679844 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4mp4v"] Apr 22 19:10:52.681892 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:10:52.681870 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf941c8a3_428c_47d0_a796_fd116d4256dc.slice/crio-de82e83b3812e3ff6bf66d97ebc35900af62bb16cdce2717516065a9c2ec565e WatchSource:0}: Error finding container de82e83b3812e3ff6bf66d97ebc35900af62bb16cdce2717516065a9c2ec565e: Status 404 returned error can't find the container with id de82e83b3812e3ff6bf66d97ebc35900af62bb16cdce2717516065a9c2ec565e Apr 22 19:10:53.238800 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.238767 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mp4v" event={"ID":"f941c8a3-428c-47d0-a796-fd116d4256dc","Type":"ContainerStarted","Data":"de82e83b3812e3ff6bf66d97ebc35900af62bb16cdce2717516065a9c2ec565e"} Apr 22 19:10:53.239956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.239914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5lsg" event={"ID":"20d45b6d-4197-46bf-bb48-01fcb92e75d7","Type":"ContainerStarted","Data":"39d0801d45179bc8e4d90955e70741702c93b84b2957b1afe6038806469ccbbe"} Apr 22 19:10:53.248964 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.248926 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249600 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy" containerID="cri-o://36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" gracePeriod=600 Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249621 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="thanos-sidecar" containerID="cri-o://26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" gracePeriod=600 Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249642 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-web" containerID="cri-o://656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" 
gracePeriod=600 Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249647 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-thanos" containerID="cri-o://521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" gracePeriod=600 Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249592 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="prometheus" containerID="cri-o://58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" gracePeriod=600 Apr 22 19:10:53.249791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.249713 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="config-reloader" containerID="cri-o://32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" gracePeriod=600 Apr 22 19:10:53.535104 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.535077 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:53.691395 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.691359 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.691585 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.691411 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.691585 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.691447 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.691887 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.691861 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.691956 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.691941 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9rk\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692055 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692011 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692111 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692077 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692111 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692103 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692146 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692172 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692199 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692359 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692232 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692359 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692256 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692359 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692287 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692521 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692521 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692382 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692521 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692419 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.692521 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.692447 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca\") pod \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\" (UID: \"1e8b49c5-b5d5-4e56-a228-f15b79a38c14\") " Apr 22 19:10:53.693724 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.693441 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:53.693724 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.693545 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:53.694234 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.694202 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.694358 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.694320 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.695095 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.694834 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:53.697367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.697135 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out" (OuterVolumeSpecName: "config-out") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:53.697945 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.697705 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.698037 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.697996 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.698469 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.697658 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:53.698773 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.698713 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:53.699355 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.699318 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:53.700013 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.699959 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.700013 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.699991 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:53.700164 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.700101 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.700505 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.700486 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config" (OuterVolumeSpecName: "config") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.701572 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.701547 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.701718 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.701682 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk" (OuterVolumeSpecName: "kube-api-access-jg9rk") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "kube-api-access-jg9rk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:53.711099 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.711075 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config" (OuterVolumeSpecName: "web-config") pod "1e8b49c5-b5d5-4e56-a228-f15b79a38c14" (UID: "1e8b49c5-b5d5-4e56-a228-f15b79a38c14"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:53.793684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793611 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793640 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config-out\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793657 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793673 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793684 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793684 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-db\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793702 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-kube-rbac-proxy\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793715 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793725 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-metrics-client-ca\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793734 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793744 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793753 2572 reconciler_common.go:299] "Volume 
detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-metrics-client-certs\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793769 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-web-config\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793783 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jg9rk\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-kube-api-access-jg9rk\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793798 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793813 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793824 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-tls-assets\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793837 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-config\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:53.793936 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:53.793850 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1e8b49c5-b5d5-4e56-a228-f15b79a38c14-secret-grpc-tls\") on node \"ip-10-0-129-110.ec2.internal\" DevicePath \"\"" Apr 22 19:10:54.246576 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246542 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" exitCode=0 Apr 22 19:10:54.246576 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246576 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" exitCode=0 Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246588 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" exitCode=0 Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246600 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" exitCode=0 Apr 22 
19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246609 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" exitCode=0 Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246618 2572 generic.go:358] "Generic (PLEG): container finished" podID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" exitCode=0 Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246665 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246682 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.246772 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} Apr 22 19:10:54.247118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} Apr 22 19:10:54.247118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} Apr 22 19:10:54.247118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246829 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} Apr 22 19:10:54.247118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246668 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.247118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.246843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1e8b49c5-b5d5-4e56-a228-f15b79a38c14","Type":"ContainerDied","Data":"4fdfc03c2e8071ded3b4e3597c8589a7235c23c713429d5fa8cb61a1ed60a227"} Apr 22 19:10:54.281830 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.281811 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:54.294891 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.294869 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:54.347052 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347024 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:54.347393 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347370 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="init-config-reloader" Apr 22 19:10:54.347393 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347394 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="init-config-reloader" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347407 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-web" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347414 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-web" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347424 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347433 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347447 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="thanos-sidecar" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347455 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="thanos-sidecar" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347471 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="config-reloader" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347480 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="config-reloader" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347531 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="prometheus" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347540 2572 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="prometheus" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347551 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-thanos" Apr 22 19:10:54.347606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347559 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-thanos" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347642 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="prometheus" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347653 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-thanos" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347664 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="config-reloader" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347673 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="thanos-sidecar" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347684 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy" Apr 22 19:10:54.348146 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.347696 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" containerName="kube-rbac-proxy-web" Apr 22 19:10:54.352745 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.352719 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.360135 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.360112 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 19:10:54.360997 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.360662 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7dkpn\"" Apr 22 19:10:54.360997 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.360722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 19:10:54.363045 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.363012 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 19:10:54.363888 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.363869 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-55lqld3e4lnpu\"" Apr 22 19:10:54.365034 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.365018 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 19:10:54.365711 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.365690 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 19:10:54.365812 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.365789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 19:10:54.366227 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.366209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 19:10:54.366777 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.366759 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 19:10:54.370852 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.370824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 19:10:54.370960 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.370862 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 19:10:54.371097 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.371078 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 19:10:54.374791 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.374775 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 19:10:54.389077 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.389058 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:54.498766 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.498766 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.498766 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498749 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499036 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499036 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499036 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498889 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-config\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499036 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499036 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.498985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4jvm\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-kube-api-access-v4jvm\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499129 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499280 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499257 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499586 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499586 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499586 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.499586 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.499417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.553150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.553131 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.581653 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.581621 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.588177 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.588151 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.594792 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.594777 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.599875 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.599850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.599970 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.599896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.599970 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.599924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.599970 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.599949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600107 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.599975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600107 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600107 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-config\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600107 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600107 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jvm\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-kube-api-access-v4jvm\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600226 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600333 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.600717 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.600339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-web-config\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.601782 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.601437 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.602152 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.602125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.602849 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.602730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.603197 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.603177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-web-config\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.603673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.603653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.604767 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.604200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.604767 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.604403 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.604767 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.604504 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a79105b9-b3c3-4585-83ca-77f18f82fd03-config-out\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.604946 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.604858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.606635 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.605400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a79105b9-b3c3-4585-83ca-77f18f82fd03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.606635 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.606072 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-config\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.606781 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.606638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.607373 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.607192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.607373 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.607322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.607583 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.607559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.607583 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.607578 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.607828 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.607807 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.609016 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.608989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a79105b9-b3c3-4585-83ca-77f18f82fd03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.609234 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.609215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jvm\" (UniqueName: \"kubernetes.io/projected/a79105b9-b3c3-4585-83ca-77f18f82fd03-kube-api-access-v4jvm\") pod \"prometheus-k8s-0\" (UID: \"a79105b9-b3c3-4585-83ca-77f18f82fd03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.640468 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.640447 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.650809 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.650741 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.651042 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.651011 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.651128 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651049 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 
521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.651128 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651095 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.651367 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.651338 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.651456 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651375 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.651456 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651400 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.651985 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.651911 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.651985 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651943 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.651985 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.651965 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.652199 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.652178 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.652257 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652206 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.652257 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652235 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.652450 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.652431 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.652615 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652457 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.652615 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652475 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.653005 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.652892 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.653005 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652922 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.653005 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.652941 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.653217 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:10:54.653183 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting 
with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.653269 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653213 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.653324 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653267 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.653494 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653465 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.653595 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653494 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.653838 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653811 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.653922 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.653839 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.654133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654082 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.654133 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654112 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.654375 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654344 2572 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.654458 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654373 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.654934 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654907 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.654995 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.654937 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.655335 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.655297 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.655335 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.655334 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.655721 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.655694 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.655800 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.655724 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.656026 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.655951 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container 
\"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.656082 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656027 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.656289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656261 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.656289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656288 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.656571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656547 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.656571 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656570 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.656883 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656790 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.656883 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.656815 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.657040 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657021 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.657103 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657040 2572 scope.go:117] 
"RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.657285 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657262 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.657368 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657288 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.657537 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657492 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.657618 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657540 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.657809 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657782 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.657809 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.657809 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.658071 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658049 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.658150 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658072 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.658311 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658289 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.658362 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658315 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.658586 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658565 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.658652 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658586 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.658809 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658792 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.658863 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.658810 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.659078 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659054 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.659144 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659080 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.659336 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659312 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container 
\"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.659390 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659338 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.659625 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659597 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.659687 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659627 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.659882 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659857 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.659971 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.659885 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.660145 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660125 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.660145 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660143 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.660384 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660368 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.660427 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660386 2572 scope.go:117] 
"RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.660658 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660634 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.660729 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660661 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.660960 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660870 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.660960 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.660893 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.661113 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661097 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.661173 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661115 2572 scope.go:117] "RemoveContainer" containerID="521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66" Apr 22 19:10:54.661346 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661319 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66"} err="failed to get container status \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": rpc error: code = NotFound desc = could not find container \"521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66\": container with ID starting with 521f7b9bb5107c97ced23d52be0be0cf781d5d8d2a1f79a45fa9aeffe4178a66 not found: ID does not exist" Apr 22 19:10:54.661346 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661345 2572 scope.go:117] "RemoveContainer" containerID="36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9" Apr 22 19:10:54.661606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661584 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9"} err="failed to get container status \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": rpc error: code = NotFound desc = could not find container \"36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9\": container with ID starting with 36ed4fea4ec9812f459ca6d3c9853cc86fa813c96fb4d840b247b944a7e536f9 not found: ID does not exist" Apr 22 19:10:54.661606 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661604 2572 scope.go:117] "RemoveContainer" containerID="656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964" Apr 22 19:10:54.661807 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661793 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964"} err="failed to get container status \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": rpc error: code = NotFound desc = could not find container \"656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964\": container with ID starting with 656a64322ae2f1a3ff2c5d4dbc37a9c214f0e4bad737431d5f969d8d6a572964 not found: ID does not exist" Apr 22 19:10:54.661862 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.661810 2572 scope.go:117] "RemoveContainer" containerID="26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4" Apr 22 19:10:54.662093 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662018 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4"} err="failed to get container status \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": rpc error: code = NotFound desc = could not find container \"26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4\": container with ID starting with 26a3560346051279d959b60c7de68bcdb6b18cd561a300361d6d374e7b05c8d4 not found: ID does not exist" Apr 22 19:10:54.662093 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662042 2572 scope.go:117] "RemoveContainer" containerID="32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b" Apr 22 19:10:54.662269 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662242 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b"} err="failed to get container status \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": rpc error: code = NotFound desc = could not find container \"32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b\": container with ID starting with 32f02552a70814f24b951a189226b7b6fd59fb8c935842cb562108b235200e9b not found: ID does not exist" Apr 22 19:10:54.662337 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662270 2572 scope.go:117] "RemoveContainer" containerID="58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10" Apr 22 19:10:54.662548 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662495 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10"} err="failed to get container status \"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": rpc error: code = NotFound desc = could not find container 
\"58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10\": container with ID starting with 58d1ba8cbde8f0be9ecbf0caa5f341689ad861f0d5217f1e62155d74497f3f10 not found: ID does not exist" Apr 22 19:10:54.662618 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662549 2572 scope.go:117] "RemoveContainer" containerID="c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb" Apr 22 19:10:54.662904 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.662835 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb"} err="failed to get container status \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": rpc error: code = NotFound desc = could not find container \"c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb\": container with ID starting with c85126d4e5e54c04e0201ddf152406a03a960af881760612d886485fd78d8cbb not found: ID does not exist" Apr 22 19:10:54.664398 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.664379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:10:54.821141 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:54.820727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:10:54.822903 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:10:54.822862 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda79105b9_b3c3_4585_83ca_77f18f82fd03.slice/crio-48220a8f9e88c95c78dd04fb2ab1337519c0b6ac85613db5ea430d873ce37f98 WatchSource:0}: Error finding container 48220a8f9e88c95c78dd04fb2ab1337519c0b6ac85613db5ea430d873ce37f98: Status 404 returned error can't find the container with id 48220a8f9e88c95c78dd04fb2ab1337519c0b6ac85613db5ea430d873ce37f98 Apr 22 19:10:55.250300 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.250267 2572 generic.go:358] "Generic (PLEG): container finished" podID="a79105b9-b3c3-4585-83ca-77f18f82fd03" containerID="dd1950e21ace32ca0c58f6f43840980099d53fdd58998423f632a0ce11f643aa" exitCode=0 Apr 22 19:10:55.250445 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.250351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerDied","Data":"dd1950e21ace32ca0c58f6f43840980099d53fdd58998423f632a0ce11f643aa"} Apr 22 19:10:55.250445 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.250384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"48220a8f9e88c95c78dd04fb2ab1337519c0b6ac85613db5ea430d873ce37f98"} Apr 22 19:10:55.251633 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.251607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4mp4v" event={"ID":"f941c8a3-428c-47d0-a796-fd116d4256dc","Type":"ContainerStarted","Data":"5ea0ddde577a839687dbaf58284c9d539b887e1998ac3c7656d958a4e1a4c7b1"} Apr 22 19:10:55.253820 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.253798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5lsg" event={"ID":"20d45b6d-4197-46bf-bb48-01fcb92e75d7","Type":"ContainerStarted","Data":"bdf0b3b5490fc86195464557f55620728c9b131c8608d8537259f372618092eb"} 
Apr 22 19:10:55.253930 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.253827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5lsg" event={"ID":"20d45b6d-4197-46bf-bb48-01fcb92e75d7","Type":"ContainerStarted","Data":"fab19e00e192bad174d435191b5a9a68989d715b627e39b853a39c52bec2b209"} Apr 22 19:10:55.254007 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.253990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-s5lsg" Apr 22 19:10:55.299967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.299927 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s5lsg" podStartSLOduration=251.379538777 podStartE2EDuration="4m13.299912881s" podCreationTimestamp="2026-04-22 19:06:42 +0000 UTC" firstStartedPulling="2026-04-22 19:10:52.669118046 +0000 UTC m=+283.716874236" lastFinishedPulling="2026-04-22 19:10:54.589492143 +0000 UTC m=+285.637248340" observedRunningTime="2026-04-22 19:10:55.29891558 +0000 UTC m=+286.346671790" watchObservedRunningTime="2026-04-22 19:10:55.299912881 +0000 UTC m=+286.347669086" Apr 22 19:10:55.336411 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.336352 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4mp4v" podStartSLOduration=251.428315614 podStartE2EDuration="4m13.336332199s" podCreationTimestamp="2026-04-22 19:06:42 +0000 UTC" firstStartedPulling="2026-04-22 19:10:52.683544183 +0000 UTC m=+283.731300368" lastFinishedPulling="2026-04-22 19:10:54.591560754 +0000 UTC m=+285.639316953" observedRunningTime="2026-04-22 19:10:55.335730282 +0000 UTC m=+286.383486489" watchObservedRunningTime="2026-04-22 19:10:55.336332199 +0000 UTC m=+286.384088406" Apr 22 19:10:55.463962 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:55.463099 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8b49c5-b5d5-4e56-a228-f15b79a38c14" path="/var/lib/kubelet/pods/1e8b49c5-b5d5-4e56-a228-f15b79a38c14/volumes" Apr 22 19:10:56.260210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"75e9ac4252d64c614907dd319770e1b50714f1157708f167b6060bf200b9ea00"} Apr 22 19:10:56.260210 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"eb09568e2206fde49aa6125e95d58fab2c22f816a2bc1c3601c483df6db17627"} Apr 22 19:10:56.260632 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"d689b0f6e996df1865f92efd01a24455e08f32619208b7c6ce4241710a39a321"} Apr 22 19:10:56.260632 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"57aa8fc39bd1092ee07df996199576b22087da1792873c5be634f969f071c7af"} Apr 22 19:10:56.260632 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"115ad1955441dc0f01494bd41b43d10e5b75467cf0003a439110154e0c4e65b3"} Apr 22 19:10:56.260632 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.260257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a79105b9-b3c3-4585-83ca-77f18f82fd03","Type":"ContainerStarted","Data":"9641bad865ca62bc03506837ee689664869a65ccde1ed6a91c9b1e91927f7641"} Apr 22 19:10:56.292446 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:56.292306 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.2922845929999998 podStartE2EDuration="2.292284593s" podCreationTimestamp="2026-04-22 19:10:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:56.290206358 +0000 UTC m=+287.337962598" watchObservedRunningTime="2026-04-22 19:10:56.292284593 +0000 UTC m=+287.340040801" Apr 22 19:10:59.664672 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:10:59.664637 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:11:05.263000 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:05.262973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s5lsg" Apr 22 19:11:09.381588 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:09.381555 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:11:09.382147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:09.381713 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:11:09.399670 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:09.399652 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 19:11:54.664678 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:54.664641 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:11:54.679873 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:54.679851 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:11:55.436790 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:11:55.436762 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:15:55.393303 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.393269 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-zc6x8"] Apr 22 19:15:55.396385 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.396369 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.400381 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.400360 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-jcjsk\"" Apr 22 19:15:55.400482 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.400431 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 19:15:55.413183 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.413168 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 19:15:55.420775 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.420757 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-zc6x8"] Apr 22 19:15:55.456119 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.456086 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-bound-sa-token\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.456229 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.456182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjvf\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-kube-api-access-ckjvf\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.557186 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.557160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjvf\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-kube-api-access-ckjvf\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.557466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.557443 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-bound-sa-token\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.566634 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.566608 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-bound-sa-token\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.566706 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.566682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckjvf\" (UniqueName: \"kubernetes.io/projected/aa69cace-b5b6-4491-b277-8be05a04a627-kube-api-access-ckjvf\") pod \"cert-manager-79c8d999ff-zc6x8\" (UID: \"aa69cace-b5b6-4491-b277-8be05a04a627\") " pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.717295 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.717273 2572 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-zc6x8" Apr 22 19:15:55.835105 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:15:55.835042 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa69cace_b5b6_4491_b277_8be05a04a627.slice/crio-07cbd64beb3e0a2bbfcca5c76ee2ff34c3924bff7e39035ebba8f138263ba0a4 WatchSource:0}: Error finding container 07cbd64beb3e0a2bbfcca5c76ee2ff34c3924bff7e39035ebba8f138263ba0a4: Status 404 returned error can't find the container with id 07cbd64beb3e0a2bbfcca5c76ee2ff34c3924bff7e39035ebba8f138263ba0a4 Apr 22 19:15:55.836613 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.836589 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-zc6x8"] Apr 22 19:15:55.837163 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:55.837148 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:15:56.089846 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:56.089779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-zc6x8" event={"ID":"aa69cace-b5b6-4491-b277-8be05a04a627","Type":"ContainerStarted","Data":"07cbd64beb3e0a2bbfcca5c76ee2ff34c3924bff7e39035ebba8f138263ba0a4"} Apr 22 19:15:59.102102 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:59.102018 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-zc6x8" event={"ID":"aa69cace-b5b6-4491-b277-8be05a04a627","Type":"ContainerStarted","Data":"ec348831299056377e717c6fe27877044da1354a1f3cdfac731b25a078b6e208"} Apr 22 19:15:59.120367 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:15:59.120310 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-zc6x8" podStartSLOduration=1.230265608 podStartE2EDuration="4.120292725s" podCreationTimestamp="2026-04-22 19:15:55 +0000 UTC" firstStartedPulling="2026-04-22 19:15:55.837272922 +0000 UTC m=+586.885029106" lastFinishedPulling="2026-04-22 19:15:58.727300036 +0000 UTC m=+589.775056223" observedRunningTime="2026-04-22 19:15:59.119589752 +0000 UTC m=+590.167345958" watchObservedRunningTime="2026-04-22 19:15:59.120292725 +0000 UTC m=+590.168048933" Apr 22 19:16:06.832864 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.832822 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c"] Apr 22 19:16:06.836115 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.836095 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:06.846884 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.846865 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 19:16:06.846967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.846892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 19:16:06.846967 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.846897 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 19:16:06.847441 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.847424 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-dvr7t\"" Apr 22 19:16:06.847506 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.847484 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 19:16:06.868017 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.867991 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c"] Apr 22 19:16:06.956232 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.956204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:06.956377 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.956281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:06.956377 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:06.956365 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhxn\" (UniqueName: \"kubernetes.io/projected/d186bd35-b3de-4042-a859-79a792233992-kube-api-access-hkhxn\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.056787 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.056755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.056941 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.056812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.056941 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.056847 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhxn\" (UniqueName: \"kubernetes.io/projected/d186bd35-b3de-4042-a859-79a792233992-kube-api-access-hkhxn\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.059147 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.059115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.059266 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.059164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d186bd35-b3de-4042-a859-79a792233992-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.065693 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.065667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhxn\" (UniqueName: \"kubernetes.io/projected/d186bd35-b3de-4042-a859-79a792233992-kube-api-access-hkhxn\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-wkn2c\" (UID: \"d186bd35-b3de-4042-a859-79a792233992\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.146121 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.146067 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:07.277854 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:07.277830 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c"] Apr 22 19:16:07.280177 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:16:07.280146 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd186bd35_b3de_4042_a859_79a792233992.slice/crio-140d4d7bbd998fb84e3e30eec4fc35b290f40c859c8b44b506a56d7399218171 WatchSource:0}: Error finding container 140d4d7bbd998fb84e3e30eec4fc35b290f40c859c8b44b506a56d7399218171: Status 404 returned error can't find the container with id 140d4d7bbd998fb84e3e30eec4fc35b290f40c859c8b44b506a56d7399218171 Apr 22 19:16:08.130486 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:08.130440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" event={"ID":"d186bd35-b3de-4042-a859-79a792233992","Type":"ContainerStarted","Data":"140d4d7bbd998fb84e3e30eec4fc35b290f40c859c8b44b506a56d7399218171"} Apr 22 19:16:09.654576 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:09.654551 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:16:09.655002 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:09.654549 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:16:10.138389 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:10.138351 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" event={"ID":"d186bd35-b3de-4042-a859-79a792233992","Type":"ContainerStarted","Data":"d2fb944c5b0341ac5892c7d702d581296463afd9fd90194b0b5a6a7eae21c18f"} Apr 22 19:16:10.138610 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:10.138441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:13.363105 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.363041 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" podStartSLOduration=4.951633076 podStartE2EDuration="7.363020126s" podCreationTimestamp="2026-04-22 19:16:06 +0000 UTC" firstStartedPulling="2026-04-22 19:16:07.281718547 +0000 UTC m=+598.329474732" lastFinishedPulling="2026-04-22 19:16:09.693105596 +0000 UTC m=+600.740861782" observedRunningTime="2026-04-22 19:16:10.172354157 +0000 UTC m=+601.220110379" watchObservedRunningTime="2026-04-22 19:16:13.363020126 +0000 UTC m=+604.410776333" Apr 22 19:16:13.364768 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.364741 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf"] Apr 22 19:16:13.368100 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.368084 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.383542 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:13.383488 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"metrics-server-cert\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" type="*v1.Secret" Apr 22 19:16:13.383634 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:13.383590 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"lws-controller-manager-dockercfg-kpm8c\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kpm8c\"" type="*v1.Secret" Apr 22 19:16:13.383634 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:13.383594 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"lws-manager-config\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" type="*v1.ConfigMap" Apr 22 19:16:13.383765 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:13.383637 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 22 19:16:13.383765 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:13.383640 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"webhook-server-cert\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" type="*v1.Secret" Apr 22 19:16:13.383973 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.383950 2572 status_manager.go:895] "Failed to get status for pod" podUID="1cf8e953-2308-4745-999d-3f136fcb7312" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" err="pods \"lws-controller-manager-6769c56bf6-sq8mf\" is forbidden: User \"system:node:ip-10-0-129-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-lws-operator\": no relationship found between node 'ip-10-0-129-110.ec2.internal' and this object" Apr 22 19:16:13.387912 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.387895 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:16:13.394251 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.394227 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf"] Apr 22 19:16:13.508298 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.508263 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.508460 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.508330 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.508460 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.508384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjswr\" (UniqueName: \"kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.508460 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.508406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.609462 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.609430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjswr\" (UniqueName: \"kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.609602 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.609466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.609651 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.609609 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:13.609694 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:13.609664 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:14.285975 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.285944 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kpm8c\"" Apr 22 19:16:14.355598 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.355572 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 19:16:14.362094 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.362071 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-metrics-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:14.610664 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.610586 2572 secret.go:189] Couldn't get secret openshift-lws-operator/webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Apr 22 19:16:14.610664 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.610652 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert podName:1cf8e953-2308-4745-999d-3f136fcb7312 nodeName:}" failed. No retries permitted until 2026-04-22 19:16:15.110638065 +0000 UTC m=+606.158394249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert") pod "lws-controller-manager-6769c56bf6-sq8mf" (UID: "1cf8e953-2308-4745-999d-3f136fcb7312") : failed to sync secret cache: timed out waiting for the condition Apr 22 19:16:14.611035 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.610596 2572 configmap.go:193] Couldn't get configMap openshift-lws-operator/lws-manager-config: failed to sync configmap cache: timed out waiting for the condition Apr 22 19:16:14.611035 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.610725 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config podName:1cf8e953-2308-4745-999d-3f136fcb7312 nodeName:}" failed. No retries permitted until 2026-04-22 19:16:15.110712472 +0000 UTC m=+606.158468664 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "manager-config" (UniqueName: "kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config") pod "lws-controller-manager-6769c56bf6-sq8mf" (UID: "1cf8e953-2308-4745-999d-3f136fcb7312") : failed to sync configmap cache: timed out waiting for the condition Apr 22 19:16:14.618844 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.618821 2572 projected.go:289] Couldn't get configMap openshift-lws-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 22 19:16:14.618935 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.618852 2572 projected.go:194] Error preparing data for projected volume kube-api-access-fjswr for pod openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf: failed to sync configmap cache: timed out waiting for the condition Apr 22 19:16:14.618935 ip-10-0-129-110 kubenswrapper[2572]: E0422 19:16:14.618891 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr podName:1cf8e953-2308-4745-999d-3f136fcb7312 nodeName:}" failed. No retries permitted until 2026-04-22 19:16:15.118881033 +0000 UTC m=+606.166637217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fjswr" (UniqueName: "kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr") pod "lws-controller-manager-6769c56bf6-sq8mf" (UID: "1cf8e953-2308-4745-999d-3f136fcb7312") : failed to sync configmap cache: timed out waiting for the condition Apr 22 19:16:14.645643 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.645621 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 19:16:14.700809 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.700791 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 19:16:14.950630 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:14.950568 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 19:16:15.123454 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.123425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjswr\" (UniqueName: \"kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.123454 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.123456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.123656 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.123498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" 
Apr 22 19:16:15.124111 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.124091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1cf8e953-2308-4745-999d-3f136fcb7312-manager-config\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.125885 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.125864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1cf8e953-2308-4745-999d-3f136fcb7312-cert\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.125978 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.125966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjswr\" (UniqueName: \"kubernetes.io/projected/1cf8e953-2308-4745-999d-3f136fcb7312-kube-api-access-fjswr\") pod \"lws-controller-manager-6769c56bf6-sq8mf\" (UID: \"1cf8e953-2308-4745-999d-3f136fcb7312\") " pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.176521 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.176486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:15.301604 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:15.301573 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf"] Apr 22 19:16:15.303704 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:16:15.303676 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf8e953_2308_4745_999d_3f136fcb7312.slice/crio-54bbdd4e3fbf04a35efaa3eb953c0350307c3cad232c6284b52489bcdde55205 WatchSource:0}: Error finding container 54bbdd4e3fbf04a35efaa3eb953c0350307c3cad232c6284b52489bcdde55205: Status 404 returned error can't find the container with id 54bbdd4e3fbf04a35efaa3eb953c0350307c3cad232c6284b52489bcdde55205 Apr 22 19:16:16.158583 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:16.158535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" event={"ID":"1cf8e953-2308-4745-999d-3f136fcb7312","Type":"ContainerStarted","Data":"54bbdd4e3fbf04a35efaa3eb953c0350307c3cad232c6284b52489bcdde55205"} Apr 22 19:16:18.165754 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:18.165724 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" event={"ID":"1cf8e953-2308-4745-999d-3f136fcb7312","Type":"ContainerStarted","Data":"cc214fa0bfd979bfe6a5f8786f916fcc7884bfec88a358fd6a58c0541193aed0"} Apr 22 19:16:18.166102 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:18.165839 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:18.188065 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:18.188018 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" podStartSLOduration=2.991957798 podStartE2EDuration="5.188000658s" 
podCreationTimestamp="2026-04-22 19:16:13 +0000 UTC" firstStartedPulling="2026-04-22 19:16:15.305490887 +0000 UTC m=+606.353247070" lastFinishedPulling="2026-04-22 19:16:17.501533745 +0000 UTC m=+608.549289930" observedRunningTime="2026-04-22 19:16:18.186588741 +0000 UTC m=+609.234344941" watchObservedRunningTime="2026-04-22 19:16:18.188000658 +0000 UTC m=+609.235756864" Apr 22 19:16:21.143474 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:21.143446 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-wkn2c" Apr 22 19:16:25.407925 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.407890 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-xzbwh"] Apr 22 19:16:25.411347 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.411327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.414037 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.414016 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 19:16:25.415080 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.415061 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 19:16:25.415173 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.415065 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9nv99\"" Apr 22 19:16:25.424349 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.424326 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-xzbwh"] Apr 22 19:16:25.515900 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.515865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tmp\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.515900 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.515905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tls-certs\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.516092 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.515945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2ln\" (UniqueName: \"kubernetes.io/projected/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-kube-api-access-pb2ln\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.617191 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.617157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2ln\" (UniqueName: \"kubernetes.io/projected/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-kube-api-access-pb2ln\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " 
pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.617374 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.617282 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tmp\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.617374 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.617328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tls-certs\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.619493 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.619462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tmp\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.619679 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.619663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-tls-certs\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.625576 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.625554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2ln\" (UniqueName: \"kubernetes.io/projected/1822ac4e-02b0-4ba2-b4ea-7053ab90583f-kube-api-access-pb2ln\") pod \"kube-auth-proxy-77597c7855-xzbwh\" (UID: \"1822ac4e-02b0-4ba2-b4ea-7053ab90583f\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.721183 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.721156 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" Apr 22 19:16:25.843664 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:25.843639 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-xzbwh"] Apr 22 19:16:25.846306 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:16:25.846283 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1822ac4e_02b0_4ba2_b4ea_7053ab90583f.slice/crio-60eaf5f9245daab890ee20738962309c308387b06a018fd97efdf912c93d1477 WatchSource:0}: Error finding container 60eaf5f9245daab890ee20738962309c308387b06a018fd97efdf912c93d1477: Status 404 returned error can't find the container with id 60eaf5f9245daab890ee20738962309c308387b06a018fd97efdf912c93d1477 Apr 22 19:16:26.191184 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:26.191099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" event={"ID":"1822ac4e-02b0-4ba2-b4ea-7053ab90583f","Type":"ContainerStarted","Data":"60eaf5f9245daab890ee20738962309c308387b06a018fd97efdf912c93d1477"} Apr 22 19:16:29.171137 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:29.171061 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6769c56bf6-sq8mf" Apr 22 19:16:30.204814 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:30.204730 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" event={"ID":"1822ac4e-02b0-4ba2-b4ea-7053ab90583f","Type":"ContainerStarted","Data":"39c35753d59ef2c91de6b21c374ded6194cd1e6408b1b1820e939934edc4aa32"} Apr 22 19:16:30.224525 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:16:30.224459 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-77597c7855-xzbwh" podStartSLOduration=1.219611885 podStartE2EDuration="5.2244436s" podCreationTimestamp="2026-04-22 19:16:25 +0000 UTC" firstStartedPulling="2026-04-22 19:16:25.847982781 +0000 UTC m=+616.895738965" lastFinishedPulling="2026-04-22 19:16:29.852814488 +0000 UTC m=+620.900570680" observedRunningTime="2026-04-22 19:16:30.222458942 +0000 UTC m=+621.270215359" watchObservedRunningTime="2026-04-22 19:16:30.2244436 +0000 UTC m=+621.272199850" Apr 22 19:18:19.679660 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.679630 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz"] Apr 22 19:18:19.683018 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.683001 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.686556 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.686535 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 19:18:19.686649 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.686565 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 19:18:19.687343 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.687324 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 19:18:19.687667 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.687640 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 19:18:19.687781 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.687696 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6vvpl\"" Apr 22 19:18:19.690609 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.690588 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz"] Apr 22 19:18:19.794742 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.794718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf111f7-24f3-4368-9002-738a2456c7fe-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.794863 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.794770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2z5\" (UniqueName: \"kubernetes.io/projected/bcf111f7-24f3-4368-9002-738a2456c7fe-kube-api-access-7x2z5\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.794863 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.794845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf111f7-24f3-4368-9002-738a2456c7fe-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.896122 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.896098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2z5\" (UniqueName: \"kubernetes.io/projected/bcf111f7-24f3-4368-9002-738a2456c7fe-kube-api-access-7x2z5\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.896233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.896138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf111f7-24f3-4368-9002-738a2456c7fe-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.896233 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.896170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf111f7-24f3-4368-9002-738a2456c7fe-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.896842 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.896824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bcf111f7-24f3-4368-9002-738a2456c7fe-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.898471 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.898446 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bcf111f7-24f3-4368-9002-738a2456c7fe-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.904108 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.904084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2z5\" (UniqueName: \"kubernetes.io/projected/bcf111f7-24f3-4368-9002-738a2456c7fe-kube-api-access-7x2z5\") pod \"kuadrant-console-plugin-6cb54b5c86-78xjz\" (UID: \"bcf111f7-24f3-4368-9002-738a2456c7fe\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:19.993372 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:19.993352 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" Apr 22 19:18:20.109971 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:20.109945 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz"] Apr 22 19:18:20.111999 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:18:20.111970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf111f7_24f3_4368_9002_738a2456c7fe.slice/crio-3ceb02ad3298a58ac573d34d4d72c7b6668412026bf617dcb951de2491224f7e WatchSource:0}: Error finding container 3ceb02ad3298a58ac573d34d4d72c7b6668412026bf617dcb951de2491224f7e: Status 404 returned error can't find the container with id 3ceb02ad3298a58ac573d34d4d72c7b6668412026bf617dcb951de2491224f7e Apr 22 19:18:20.552044 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:20.552003 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" event={"ID":"bcf111f7-24f3-4368-9002-738a2456c7fe","Type":"ContainerStarted","Data":"3ceb02ad3298a58ac573d34d4d72c7b6668412026bf617dcb951de2491224f7e"} Apr 22 19:18:46.647225 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:46.647137 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" event={"ID":"bcf111f7-24f3-4368-9002-738a2456c7fe","Type":"ContainerStarted","Data":"c57645676ee1f1165e13b92c39904e25b3c36981ba2ebbadc8c542a57f6db182"} Apr 22 19:18:46.679405 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:18:46.679361 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-78xjz" podStartSLOduration=1.551921427 podStartE2EDuration="27.679348032s" podCreationTimestamp="2026-04-22 19:18:19 +0000 UTC" firstStartedPulling="2026-04-22 19:18:20.113275749 +0000 UTC m=+731.161031933" lastFinishedPulling="2026-04-22 19:18:46.240702355 +0000 UTC m=+757.288458538" observedRunningTime="2026-04-22 19:18:46.679031945 +0000 UTC m=+757.726788150" watchObservedRunningTime="2026-04-22 19:18:46.679348032 +0000 UTC m=+757.727104238" Apr 22 19:19:03.298908 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.298872 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:19:03.301270 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.301247 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.303921 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.303900 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 19:19:03.317353 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.317329 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:19:03.333637 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.333611 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:19:03.462881 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.462855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2c120789-ddd5-4dc6-a8a5-6d181797840b-config-file\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.463048 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.462914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfchm\" (UniqueName: \"kubernetes.io/projected/2c120789-ddd5-4dc6-a8a5-6d181797840b-kube-api-access-jfchm\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.563779 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.563705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfchm\" (UniqueName: \"kubernetes.io/projected/2c120789-ddd5-4dc6-a8a5-6d181797840b-kube-api-access-jfchm\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.563901 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.563788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2c120789-ddd5-4dc6-a8a5-6d181797840b-config-file\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.564319 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.564302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2c120789-ddd5-4dc6-a8a5-6d181797840b-config-file\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.575868 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.575840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfchm\" (UniqueName: \"kubernetes.io/projected/2c120789-ddd5-4dc6-a8a5-6d181797840b-kube-api-access-jfchm\") pod \"limitador-limitador-78c99df468-5l5vn\" (UID: \"2c120789-ddd5-4dc6-a8a5-6d181797840b\") " pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.611849 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.611823 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:03.731769 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:03.731735 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:19:03.733787 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:19:03.733759 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c120789_ddd5_4dc6_a8a5_6d181797840b.slice/crio-d966e798e142604b0ae3cbc5e082e6fc443daf7109bf362c80c4eee4ce30273d WatchSource:0}: Error finding container d966e798e142604b0ae3cbc5e082e6fc443daf7109bf362c80c4eee4ce30273d: Status 404 returned error can't find the container with id d966e798e142604b0ae3cbc5e082e6fc443daf7109bf362c80c4eee4ce30273d Apr 22 19:19:04.705953 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:04.705913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" event={"ID":"2c120789-ddd5-4dc6-a8a5-6d181797840b","Type":"ContainerStarted","Data":"d966e798e142604b0ae3cbc5e082e6fc443daf7109bf362c80c4eee4ce30273d"} Apr 22 19:19:06.715224 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:06.715192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" event={"ID":"2c120789-ddd5-4dc6-a8a5-6d181797840b","Type":"ContainerStarted","Data":"532b296a6f6cf64a9fccdd27414eea9b620ddb00b5d73f18294a568bbcd66881"} Apr 22 19:19:06.715604 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:06.715325 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:06.733964 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:06.733921 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" podStartSLOduration=1.383838823 podStartE2EDuration="3.733908091s" podCreationTimestamp="2026-04-22 19:19:03 +0000 UTC" firstStartedPulling="2026-04-22 19:19:03.735642574 +0000 UTC m=+774.783398758" lastFinishedPulling="2026-04-22 19:19:06.085711842 +0000 UTC m=+777.133468026" observedRunningTime="2026-04-22 19:19:06.732296119 +0000 UTC m=+777.780052326" watchObservedRunningTime="2026-04-22 19:19:06.733908091 +0000 UTC m=+777.781664297" Apr 22 19:19:17.719613 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:17.719579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-5l5vn" Apr 22 19:19:40.427012 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:19:40.426922 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:15.224888 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:15.224851 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:18.314591 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:18.314559 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:27.049464 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:27.049436 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:34.117773 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:34.117738 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:39.631533 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:39.631488 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:20:43.904483 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:20:43.904443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5l5vn"] Apr 22 19:21:09.677486 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:21:09.677450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:21:09.678814 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:21:09.678789 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log" Apr 22 19:23:47.121994 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:47.121959 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-wkn2c_d186bd35-b3de-4042-a859-79a792233992/manager/0.log" Apr 22 19:23:48.966081 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:48.966055 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-78xjz_bcf111f7-24f3-4368-9002-738a2456c7fe/kuadrant-console-plugin/0.log" Apr 22 19:23:49.287016 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:49.286928 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5l5vn_2c120789-ddd5-4dc6-a8a5-6d181797840b/limitador/0.log" Apr 22 19:23:50.053260 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:50.053226 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-77597c7855-xzbwh_1822ac4e-02b0-4ba2-b4ea-7053ab90583f/kube-auth-proxy/0.log" Apr 22 19:23:50.256652 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:50.256599 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-fc6bc5684-m58jn_763efc39-e846-49c8-8d1f-df055b7efff8/router/0.log" Apr 22 19:23:55.270039 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.270000 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tdl99/must-gather-vr92w"] Apr 22 19:23:55.272708 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.272686 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.275027 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.275001 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"openshift-service-ca.crt\"" Apr 22 19:23:55.275111 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.275059 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tdl99\"/\"default-dockercfg-ncqxp\"" Apr 22 19:23:55.276064 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.276039 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"kube-root-ca.crt\"" Apr 22 19:23:55.290291 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.290270 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/must-gather-vr92w"] Apr 22 19:23:55.347940 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.347914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5q26\" (UniqueName: \"kubernetes.io/projected/d33c6add-4dbc-4497-b233-29ed0b466101-kube-api-access-f5q26\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.348045 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.347975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d33c6add-4dbc-4497-b233-29ed0b466101-must-gather-output\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.448673 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.448650 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5q26\" (UniqueName: \"kubernetes.io/projected/d33c6add-4dbc-4497-b233-29ed0b466101-kube-api-access-f5q26\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.448797 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.448752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d33c6add-4dbc-4497-b233-29ed0b466101-must-gather-output\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.449112 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.449094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d33c6add-4dbc-4497-b233-29ed0b466101-must-gather-output\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.456282 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.456259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5q26\" (UniqueName: \"kubernetes.io/projected/d33c6add-4dbc-4497-b233-29ed0b466101-kube-api-access-f5q26\") pod \"must-gather-vr92w\" (UID: \"d33c6add-4dbc-4497-b233-29ed0b466101\") " pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.581608 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.581536 2572 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/must-gather-vr92w" Apr 22 19:23:55.907715 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.907689 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/must-gather-vr92w"] Apr 22 19:23:55.910186 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:23:55.910158 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd33c6add_4dbc_4497_b233_29ed0b466101.slice/crio-f9422ffc8ce50b0368282e5fd6db78c3d36de34e3faac15888b514d42282046c WatchSource:0}: Error finding container f9422ffc8ce50b0368282e5fd6db78c3d36de34e3faac15888b514d42282046c: Status 404 returned error can't find the container with id f9422ffc8ce50b0368282e5fd6db78c3d36de34e3faac15888b514d42282046c Apr 22 19:23:55.911734 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:55.911718 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:23:56.653676 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:56.653639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/must-gather-vr92w" event={"ID":"d33c6add-4dbc-4497-b233-29ed0b466101","Type":"ContainerStarted","Data":"f9422ffc8ce50b0368282e5fd6db78c3d36de34e3faac15888b514d42282046c"} Apr 22 19:23:57.659987 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:57.659843 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/must-gather-vr92w" event={"ID":"d33c6add-4dbc-4497-b233-29ed0b466101","Type":"ContainerStarted","Data":"5312411990dfd6be1b5cb2ec1480812983a5662345bcbc596754348b29e22343"} Apr 22 19:23:57.659987 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:57.659888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/must-gather-vr92w" event={"ID":"d33c6add-4dbc-4497-b233-29ed0b466101","Type":"ContainerStarted","Data":"f276867fa69a47e1da3f30a76d7f0c6e7a1d7e4add034d4c5f5d1d0b0b8f412b"} Apr 22 19:23:57.677832 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:57.677775 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tdl99/must-gather-vr92w" podStartSLOduration=1.862517402 podStartE2EDuration="2.677755932s" podCreationTimestamp="2026-04-22 19:23:55 +0000 UTC" firstStartedPulling="2026-04-22 19:23:55.911876692 +0000 UTC m=+1066.959632877" lastFinishedPulling="2026-04-22 19:23:56.727115217 +0000 UTC m=+1067.774871407" observedRunningTime="2026-04-22 19:23:57.674615039 +0000 UTC m=+1068.722371242" watchObservedRunningTime="2026-04-22 19:23:57.677755932 +0000 UTC m=+1068.725512141" Apr 22 19:23:58.251986 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:58.251957 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bz72b_e0ff6be1-282c-4ce7-bd85-3383ced78c15/global-pull-secret-syncer/0.log" Apr 22 19:23:58.335965 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:58.335935 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jr9sz_da0eed59-92c9-4593-b2f3-3ec47cc8c911/konnectivity-agent/0.log" Apr 22 19:23:58.411844 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:23:58.411801 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-110.ec2.internal_54e4518df75bf5ebe24281d874521911/haproxy/0.log" Apr 22 19:24:02.768648 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:02.768614 2572 
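
The "Observed pod startup duration" entry above is internally consistent: podStartSLOduration (1.862517402) appears to be podStartE2EDuration (2.677755932s) minus the image-pull window, i.e. the SLO metric excludes time spent pulling images. The monotonic clock readings (the "m=+..." suffixes) reproduce the logged value exactly. A minimal check in Python, using only values from the entry above (the variable names are ours, not the kubelet's):

    # Monotonic readings ("m=+..." suffixes) from the entry above.
    first_started_pulling = 1066.959632877
    last_finished_pulling = 1067.774871407
    e2e_duration = 2.677755932                    # podStartE2EDuration

    pull_window = last_finished_pulling - first_started_pulling
    slo_duration = e2e_duration - pull_window
    print(f"{slo_duration:.9f}")                  # 1.862517402, matching podStartSLOduration
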
Apr 22 19:24:02.865993 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:02.865878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5l5vn_2c120789-ddd5-4dc6-a8a5-6d181797840b/limitador/0.log"
Apr 22 19:24:04.456762 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.456732 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/1.log"
Apr 22 19:24:04.549465 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.549378 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-s8877_9d9eb13c-7ebf-45ab-8aa0-ee5b0aaf46d4/cluster-monitoring-operator/0.log"
Apr 22 19:24:04.575221 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.575186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w9kv6_8d77a754-770f-4df8-a80a-4341eb422dea/kube-state-metrics/0.log"
Apr 22 19:24:04.600292 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.600262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w9kv6_8d77a754-770f-4df8-a80a-4341eb422dea/kube-rbac-proxy-main/0.log"
Apr 22 19:24:04.630488 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.630461 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-w9kv6_8d77a754-770f-4df8-a80a-4341eb422dea/kube-rbac-proxy-self/0.log"
Apr 22 19:24:04.656717 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.656687 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5bd878c756-8sl5r_a2c1bf90-21ac-417c-a652-f49544683d3e/metrics-server/0.log"
Apr 22 19:24:04.683687 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.683631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-g69c6_58128424-86cf-418f-b63e-97891f8faf56/monitoring-plugin/0.log"
Apr 22 19:24:04.738215 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.738167 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g586_1e090539-33ef-4cd3-9340-65a4afb737cd/node-exporter/0.log"
Apr 22 19:24:04.761031 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.760994 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g586_1e090539-33ef-4cd3-9340-65a4afb737cd/kube-rbac-proxy/0.log"
Apr 22 19:24:04.785289 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:04.785261 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g586_1e090539-33ef-4cd3-9340-65a4afb737cd/init-textfile/0.log"
Apr 22 19:24:05.052406 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.052379 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/prometheus/0.log"
Apr 22 19:24:05.076619 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.076578 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/config-reloader/0.log"
Apr 22 19:24:05.099469 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.099446 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/thanos-sidecar/0.log"
Apr 22 19:24:05.124457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.124433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/kube-rbac-proxy-web/0.log"
Apr 22 19:24:05.154491 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.154467 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/kube-rbac-proxy/0.log"
Apr 22 19:24:05.180844 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.180810 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/kube-rbac-proxy-thanos/0.log"
Apr 22 19:24:05.202970 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.202943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a79105b9-b3c3-4585-83ca-77f18f82fd03/init-config-reloader/0.log"
Apr 22 19:24:05.308542 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.308450 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79794fcb7c-7xbcx_adea0732-ba1b-4d0d-b6a2-49a6e98f453f/telemeter-client/0.log"
Apr 22 19:24:05.330623 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.330592 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79794fcb7c-7xbcx_adea0732-ba1b-4d0d-b6a2-49a6e98f453f/reload/0.log"
Apr 22 19:24:05.359709 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:05.359672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79794fcb7c-7xbcx_adea0732-ba1b-4d0d-b6a2-49a6e98f453f/kube-rbac-proxy/0.log"
Apr 22 19:24:06.440653 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.440619 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"]
Apr 22 19:24:06.447130 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.447105 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
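
Entries like the two above follow klog's structured format: a severity/date prefix (I0422, W0422), a time, a PID, a source file:line, a quoted message, then key="value" pairs. That makes it straightforward to trace one pod's SyncLoop lifecycle out of a journal export. A rough sketch; the file name, regex, and helper below are illustrative and not part of the kubelet:

    import re

    # Rough match for the klog portion of the journal lines above:
    # severity+date, time, PID, file:line], quoted message, remaining fields.
    ENTRY = re.compile(
        r'(?P<sev>[IWEF])\d{4} (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+'
        r'[\w./-]+:\d+\]\s+"(?P<msg>[^"]*)"\s*(?P<rest>.*)'
    )

    def syncloop_events(lines, pod):
        """Yield (time, message) for SyncLoop entries that mention the pod."""
        for line in lines:
            m = ENTRY.search(line)
            if m and m["msg"].startswith("SyncLoop") and pod in m["rest"]:
                yield m["time"], m["msg"]

    with open("kubelet.journal.txt") as f:  # hypothetical export of this journal
        for t, msg in syncloop_events(f, "perf-node-gather-daemonset-xbjcd"):
            print(t, msg)

Run against this excerpt, that filter would surface the ADD, UPDATE, PLEG, and probe events for the perf-node-gather pod in order.
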
Apr 22 19:24:06.450205 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.450122 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"]
Apr 22 19:24:06.537222 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.537197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-sx5qr_b9dd00bb-c13a-4382-a816-dc9639b9e184/networking-console-plugin/0.log"
Apr 22 19:24:06.566759 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.566726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-sys\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.566929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.566797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-podres\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.566929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.566840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7npgh\" (UniqueName: \"kubernetes.io/projected/03c20994-60b7-4444-9259-3272162ea054-kube-api-access-7npgh\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.566929 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.566873 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-proc\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.567073 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.567052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-lib-modules\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668159 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7npgh\" (UniqueName: \"kubernetes.io/projected/03c20994-60b7-4444-9259-3272162ea054-kube-api-access-7npgh\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668315 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-proc\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668315 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668266 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-lib-modules\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668391 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-proc\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668443 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-sys\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668479 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-lib-modules\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668479 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-podres\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668563 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-sys\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.668596 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.668565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/03c20994-60b7-4444-9259-3272162ea054-podres\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.676457 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.676431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7npgh\" (UniqueName: \"kubernetes.io/projected/03c20994-60b7-4444-9259-3272162ea054-kube-api-access-7npgh\") pod \"perf-node-gather-daemonset-xbjcd\" (UID: \"03c20994-60b7-4444-9259-3272162ea054\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.764403 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.764373 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:06.920449 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:06.920395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"]
Apr 22 19:24:06.924133 ip-10-0-129-110 kubenswrapper[2572]: W0422 19:24:06.924099 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod03c20994_60b7_4444_9259_3272162ea054.slice/crio-18d151bb7e17daeb2f4f8f398022e8c23ce9f68dfb1e63858c03cc2b1256d70d WatchSource:0}: Error finding container 18d151bb7e17daeb2f4f8f398022e8c23ce9f68dfb1e63858c03cc2b1256d70d: Status 404 returned error can't find the container with id 18d151bb7e17daeb2f4f8f398022e8c23ce9f68dfb1e63858c03cc2b1256d70d
Apr 22 19:24:07.719166 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:07.719134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd" event={"ID":"03c20994-60b7-4444-9259-3272162ea054","Type":"ContainerStarted","Data":"559772c744dfc52a6eca60c70cdc19b1e3aae8d1ef5affe15edbe03a94692bd1"}
Apr 22 19:24:07.719547 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:07.719172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd" event={"ID":"03c20994-60b7-4444-9259-3272162ea054","Type":"ContainerStarted","Data":"18d151bb7e17daeb2f4f8f398022e8c23ce9f68dfb1e63858c03cc2b1256d70d"}
Apr 22 19:24:07.719547 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:07.719260 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd"
Apr 22 19:24:07.736415 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:07.736374 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd" podStartSLOduration=1.736359722 podStartE2EDuration="1.736359722s" podCreationTimestamp="2026-04-22 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:07.734652417 +0000 UTC m=+1078.782408628" watchObservedRunningTime="2026-04-22 19:24:07.736359722 +0000 UTC m=+1078.784115921"
Apr 22 19:24:08.028775 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:08.028698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-ldws8_94ebe646-1062-42b2-ba8b-a73a0b60e0f6/volume-data-source-validator/0.log"
Apr 22 19:24:08.892023 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:08.891996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s5lsg_20d45b6d-4197-46bf-bb48-01fcb92e75d7/dns/0.log"
Apr 22 19:24:08.911227 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:08.911202 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-s5lsg_20d45b6d-4197-46bf-bb48-01fcb92e75d7/kube-rbac-proxy/0.log"
Apr 22 19:24:08.979653 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:08.979627 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jhwl6_c470388d-c98e-482f-9c89-a240f3abac2d/dns-node-resolver/0.log"
Apr 22 19:24:09.484304 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:09.484275 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tbt7s_6756b0b3-8e30-47c8-925b-478ee2126fcc/node-ca/0.log"
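
This second "Observed pod startup duration" entry shows the degenerate case: firstStartedPulling and lastFinishedPulling are the zero time (0001-01-01), suggesting no image pull was needed, so podStartSLOduration equals podStartE2EDuration. Separately, the long sweeps of "Finished parsing log file" entries encode namespace_podname_uid/container/restart in the path, so tallying which namespaces' container logs were read (here, presumably on behalf of the running must-gather) is a short script. An illustrative sketch; the pattern and helper are ours:

    import re
    from collections import Counter

    # Path layout: /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart>.log
    PARSED = re.compile(r'"Finished parsing log file" path="/var/log/pods/([^_"]+)_')

    def parsed_per_namespace(lines):
        """Count 'Finished parsing log file' entries per namespace."""
        return Counter(m.group(1) for line in lines for m in PARSED.finditer(line))

    # e.g. parsed_per_namespace(open("kubelet.journal.txt"))
    # -> Counter({'openshift-monitoring': ..., 'openshift-multus': ..., ...})
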
path="/var/log/pods/openshift-image-registry_node-ca-tbt7s_6756b0b3-8e30-47c8-925b-478ee2126fcc/node-ca/0.log" Apr 22 19:24:10.369460 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:10.369433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-77597c7855-xzbwh_1822ac4e-02b0-4ba2-b4ea-7053ab90583f/kube-auth-proxy/0.log" Apr 22 19:24:10.425634 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:10.425608 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-fc6bc5684-m58jn_763efc39-e846-49c8-8d1f-df055b7efff8/router/0.log" Apr 22 19:24:10.937858 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:10.937830 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4mp4v_f941c8a3-428c-47d0-a796-fd116d4256dc/serve-healthcheck-canary/0.log" Apr 22 19:24:11.629599 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:11.629571 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rmvq2_44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5/kube-rbac-proxy/0.log" Apr 22 19:24:11.671082 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:11.671057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rmvq2_44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5/exporter/0.log" Apr 22 19:24:11.708629 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:11.708598 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rmvq2_44f2c1d7-31d3-4edc-83b7-d8b759b0e4d5/extractor/0.log" Apr 22 19:24:13.608941 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:13.608903 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-wkn2c_d186bd35-b3de-4042-a859-79a792233992/manager/0.log" Apr 22 19:24:13.736308 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:13.736264 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-xbjcd" Apr 22 19:24:14.823067 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:14.823019 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6769c56bf6-sq8mf_1cf8e953-2308-4745-999d-3f136fcb7312/manager/0.log" Apr 22 19:24:19.149738 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:19.149707 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sf8zr_17159697-aff9-46c8-9fbc-9155d0485df2/migrator/0.log" Apr 22 19:24:19.170171 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:19.170130 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-sf8zr_17159697-aff9-46c8-9fbc-9155d0485df2/graceful-termination/0.log" Apr 22 19:24:20.454466 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.454437 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/kube-multus-additional-cni-plugins/0.log" Apr 22 19:24:20.478000 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.477969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/egress-router-binary-copy/0.log" Apr 22 19:24:20.499274 ip-10-0-129-110 kubenswrapper[2572]: I0422 
19:24:20.499248 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/cni-plugins/0.log" Apr 22 19:24:20.521050 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.521030 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/bond-cni-plugin/0.log" Apr 22 19:24:20.544999 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.544979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/routeoverride-cni/0.log" Apr 22 19:24:20.567649 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.567627 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/whereabouts-cni-bincopy/0.log" Apr 22 19:24:20.589118 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.589088 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ghkl8_a1a4c2e8-7247-484d-9439-dc4d46888d9b/whereabouts-cni/0.log" Apr 22 19:24:20.975088 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:20.975057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p7t9t_db7cc212-874b-4767-a85f-3393efedb1fa/kube-multus/0.log" Apr 22 19:24:21.103388 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:21.103363 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hvrqj_627cf532-d693-4215-85b9-807d744857ce/network-metrics-daemon/0.log" Apr 22 19:24:21.122315 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:21.122294 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hvrqj_627cf532-d693-4215-85b9-807d744857ce/kube-rbac-proxy/0.log" Apr 22 19:24:22.483448 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.483407 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/ovn-controller/0.log" Apr 22 19:24:22.507148 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.507126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/ovn-acl-logging/0.log" Apr 22 19:24:22.526397 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.526375 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/kube-rbac-proxy-node/0.log" Apr 22 19:24:22.547945 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.547923 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:24:22.566850 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.566833 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/northd/0.log" Apr 22 19:24:22.586174 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.586157 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/nbdb/0.log" Apr 22 19:24:22.611793 ip-10-0-129-110 kubenswrapper[2572]: I0422 
19:24:22.611776 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/sbdb/0.log" Apr 22 19:24:22.720344 ip-10-0-129-110 kubenswrapper[2572]: I0422 19:24:22.720314 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v4fm9_40723a08-baf4-4bac-8032-9853f6f1a2e2/ovnkube-controller/0.log"