Apr 16 19:53:57.040559 ip-10-0-130-116 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:57.507211 ip-10-0-130-116 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:57.507211 ip-10-0-130-116 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:57.507211 ip-10-0-130-116 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:57.507211 ip-10-0-130-116 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:57.507211 ip-10-0-130-116 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:57.508233 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.508095 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
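The deprecation warnings above share one remedy: carry the values in the KubeletConfiguration file named by --config (per the FLAG dump below, /etc/kubernetes/kubelet.conf on this node) instead of on the command line. A minimal sketch, assuming the values visible in this log; the unix:// scheme on the CRI endpoint and the evictionSoft thresholds (the log shows --eviction-soft empty) are illustrative assumptions:

```yaml
# Sketch of a KubeletConfiguration fragment replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
systemReserved:                                               # was --system-reserved
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
evictionSoft:                  # route --minimum-container-ttl-duration users here,
  memory.available: 500Mi      # per the warning; these thresholds are hypothetical
evictionSoftGracePeriod:
  memory.available: 1m30s
```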
Apr 16 19:53:57.513475 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513460 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:57.513475 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513475 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513479 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513483 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513486 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513489 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513506 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513509 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513512 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513514 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513517 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513520 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513523 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513525 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513528 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513531 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513534 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513536 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513539 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513542 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513544 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:57.513540 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513547 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513550 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513553 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513556 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513559 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513561 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513564 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513567 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513569 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513572 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513574 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513577 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513579 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513584 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513587 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513590 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513593 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513595 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513598 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513601 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:57.514009 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513603 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513607 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513609 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513612 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513614 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513617 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513619 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513622 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513624 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513627 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513629 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513632 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513634 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513637 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513640 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513643 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513646 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513648 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513651 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513654 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:57.514521 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513656 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513659 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513661 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513664 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513666 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513669 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513671 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513674 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513677 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513679 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513682 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513685 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513688 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513690 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513693 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513696 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513698 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513700 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513704 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513710 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:57.515004 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513713 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513716 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513718 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513721 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.513723 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514106 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514111 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514114 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514117 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514120 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514122 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514125 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514128 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514131 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514134 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514136 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514139 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514141 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514143 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514146 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:57.515509 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514149 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514151 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514153 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514157 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514159 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514162 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514164 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514167 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514169 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514172 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514174 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514176 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514179 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514181 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514184 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514187 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514189 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514191 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514195 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:57.516132 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514199 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514202 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514206 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514209 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514211 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514214 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514216 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514219 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514221 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514224 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514226 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514229 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514231 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514234 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514236 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514238 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514242 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514260 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514263 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514266 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:57.516829 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514269 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514271 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514274 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514276 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514279 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514282 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514287 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514290 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514293 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514296 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514298 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514302 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514305 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514308 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514311 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514313 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514316 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514318 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514321 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:57.517352 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514324 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514326 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514329 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514331 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514334 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514336 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514339 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514341 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514344 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514346 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514349 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514352 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.514355 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515123 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515132 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515138 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515142 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515147 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515150 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515155 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515160 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:57.517860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515163 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515166 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515170 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515175 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515178 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515181 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515184 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515188 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515191 2561 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515194 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515197 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515202 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515205 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515208 2561 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515211 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515214 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515218 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515221 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515225 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515228 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515231 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515234 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515237 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515241 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515245 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:57.518393 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515262 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515265 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515268 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515271 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515275 2561 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515278 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515282 2561 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515285 2561 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515288 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515291 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515294 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515298 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515301 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515305 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515308 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515311 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515314 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515317 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515320 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515323 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515326 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515329 2561 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515333 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515336 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515339 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:57.519130 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515343 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515346 2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515349 2561 flags.go:64] FLAG: --help="false"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515352 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515355 2561 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515358 2561 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515362 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515365 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515369 2561 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515372 2561 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515375 2561 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515378 2561 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515381 2561 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515384 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515387 2561 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515390 2561 flags.go:64] FLAG: --kube-reserved=""
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515393 2561 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515396 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515399 2561 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515402 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515405 2561 flags.go:64] FLAG: --lock-file=""
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515408 2561 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515411 2561 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515414 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 19:53:57.519763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515419 2561 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515422 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515425 2561 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515428 2561 flags.go:64] FLAG: --logging-format="text"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515431 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515435 2561 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515438 2561 flags.go:64] FLAG: --manifest-url=""
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515441 2561 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515445 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515448 2561 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515452 2561 flags.go:64] FLAG: --max-pods="110"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515455 2561 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515462 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515466 2561 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515469 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515472 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515476 2561 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515479 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515487 2561 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515490 2561 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515493 2561 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515496 2561 flags.go:64] FLAG: --pod-cidr=""
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515499 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 19:53:57.520359 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515505 2561 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515507 2561 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515511 2561 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515514 2561 flags.go:64] FLAG: --port="10250"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515517 2561 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515521 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06444ac65c27367a2"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515524 2561 flags.go:64] FLAG: --qos-reserved=""
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515527 2561 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515530 2561 flags.go:64] FLAG: --register-node="true"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515533 2561 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515536 2561 flags.go:64] FLAG: --register-with-taints=""
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515542 2561 flags.go:64] FLAG: --registry-burst="10"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515545 2561 flags.go:64] FLAG: --registry-qps="5"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515548 2561 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515551 2561 flags.go:64] FLAG: --reserved-memory=""
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515555 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515558 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515561 2561 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515564 2561 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515567 2561 flags.go:64] FLAG: --runonce="false"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515570 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515573 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515579 2561 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515582 2561 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515585 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515591 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 19:53:57.520966 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515594 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515597 2561 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515600 2561 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515603 2561 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515606 2561 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515609 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515612 2561 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515615 2561 flags.go:64] FLAG: --system-cgroups=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515618 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515624 2561 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515627 2561 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515630 2561 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515634 2561 flags.go:64] FLAG: --tls-min-version=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515637 2561 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515640 2561 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515643 2561 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515646 2561 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515649 2561 flags.go:64] FLAG: --v="2"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515653 2561 flags.go:64] FLAG: --version="false"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515657 2561 flags.go:64] FLAG: --vmodule=""
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515661 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515665 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
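The flags.go:64 run above is the kubelet echoing every registered flag, set or defaulted, at verbosity --v=2. A minimal sketch of the same pattern with Go's standard flag package (the real kubelet registers its flags via spf13/pflag, so this is an approximation; the flag names below are just examples taken from the dump):

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	// Register a few flags with defaults, as the kubelet does at startup.
	flag.Int("v", 2, "log verbosity")
	flag.Int("max-pods", 110, "maximum pods per node")
	flag.String("cloud-provider", "external", "cloud provider backend")
	flag.Parse()

	// Walk every registered flag, whether or not it was set on the command
	// line, and print it in the FLAG: --name="value" shape seen in the log.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}
```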
--topology-manager-policy="none" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515643 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515646 2561 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515649 2561 flags.go:64] FLAG: --v="2" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515653 2561 flags.go:64] FLAG: --version="false" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515657 2561 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515661 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.515665 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516130 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516143 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:57.521614 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516148 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516153 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516159 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516168 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516173 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516178 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516183 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516189 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516194 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516198 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516203 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516207 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516211 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516215 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:57.522175 ip-10-0-130-116 
kubenswrapper[2561]: W0416 19:53:57.516220 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516229 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516234 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516238 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516243 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516261 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:57.522175 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516266 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516270 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516274 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516279 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516283 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516287 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516291 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516296 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516305 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516309 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516314 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516318 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516322 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516327 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516331 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516336 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516341 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516345 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516349 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516354 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:57.522727 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516363 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516368 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516373 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516377 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516382 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516388 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516393 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516398 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516402 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516407 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516412 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516416 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516426 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516430 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516434 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516439 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516443 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516447 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516451 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:57.523226 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516455 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516459 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516463 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516467 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516472 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516476 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516485 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516489 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516494 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516498 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516504 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516509 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516513 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516517 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516522 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516537 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516541 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516546 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516550 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:57.523724 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516559 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516563 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516567 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516571 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516575 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.516579 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.517479 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.524003 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.524021 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524068 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524073 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524077 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524080 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524083 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524086 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524088 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:57.524230 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524091 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524094 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524096 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524099 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524101 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524104 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524107 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524109 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524112 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524114 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524117 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524120 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524122 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524126 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524128 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524131 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524134 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524136 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524139 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:57.524660 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524142 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524144 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524147 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524149 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524152 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524155 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524158 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524161 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524163 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524166 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524168 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524170 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524173 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524176 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524178 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524181 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524183 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524186 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524188 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524191 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:57.525117 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524194 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524196 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524199 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524201 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524203 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524206 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524208 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524211 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524214 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524216 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524219 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524221 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524224 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524226 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524229 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524232 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524235 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524238 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524241 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:57.525618 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524260 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524266 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524269 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524272 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524276 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524279 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524282 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524285 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524288 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524290 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524293 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524296 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524299 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524301 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524304 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524306 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524309 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524311 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524314 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:57.526081 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524316 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524319 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.524324 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524415 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524419 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524422 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524425 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524428 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524430 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524433 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524435 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524438 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524440 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524443 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524447 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524449 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:57.526557 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524452 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524454 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524457 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524460 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524462 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524465 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524468 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524470 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524472 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524475 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524478 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524482 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524485 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524488 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524490 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524493 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524495 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524498 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524500 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:57.526955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524503 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524505 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524508 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524511 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524513 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524515 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524518 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524520 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524523 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524525 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524528 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524530 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524534 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524537 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524539 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524542 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524546 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524549 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524552 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:57.527455 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524556 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524558 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524561 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524563 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524566 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524568 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524571 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524573 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524575 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524578 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524581 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524583 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524586 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524588 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524591 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524593 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524596 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524598 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524600 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524603 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:57.527916 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524605 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524608 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524610 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524613 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524615 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524617 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524620 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524623 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524625 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524628 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524630 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524633 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524635 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524645 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:57.524649 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:57.528418 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.524653 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:53:57.528828 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.525470 2561 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:53:57.528828 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.528602 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:53:57.529492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.529481 2561 server.go:1019] "Starting client certificate rotation"
Apr 16 19:53:57.529594 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.529577 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:57.529634 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.529613 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:53:57.558289 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.558271 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:57.561340 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.561317 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:53:57.576004 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.575983 2561 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:53:57.581603 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.581587 2561 log.go:25] "Validated CRI v1 image API"
Apr 16 19:53:57.582867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.582854 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:53:57.587810 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.587782 2561 fs.go:135] Filesystem UUIDs: map[0114f2a4-a42e-480e-8b51-42c66f4e6ce3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 adce221c-ab6a-4992-8418-84f011db69d7:/dev/nvme0n1p4]
Apr 16 19:53:57.587810 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.587808 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:53:57.595433 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.595413 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:57.596165 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.596056 2561 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:57.594174596 +0000 UTC m=+0.433979988 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3120098 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24f9d7a115172ee265cbf4b05bab61 SystemUUID:ec24f9d7-a115-172e-e265-cbf4b05bab61 BootID:3d1e9fea-4e8d-4221-88b2-8f5f6fe2f84a Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:68:99:45:9f:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:68:99:45:9f:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:59:59:00:a5:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:53:57.596165 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.596160 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:53:57.596288 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.596239 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:53:57.597300 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.597279 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:53:57.597426 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.597303 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-116.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:53:57.597476 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.597436 2561 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:53:57.597476 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.597445 2561 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:53:57.597476 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.597458 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:57.598687 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.598676 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:57.599453 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.599443 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:57.599558 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.599549 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:53:57.602035 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.602016 2561 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:53:57.602035 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.602039 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:53:57.602119 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.602056 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:53:57.602119 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.602065 2561 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:53:57.602119 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.602073 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 19:53:57.603292 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.603281 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:57.603347 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.603301 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:57.621049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.616146 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kgslj"
Apr 16 19:53:57.622043 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.622029 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 19:53:57.623332 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.623319 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 19:53:57.624366 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.624351 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kgslj"
Apr 16 19:53:57.625205 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625193 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625210 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625217 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625223 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625230 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625235 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625241 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625260 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625268 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625276 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625288 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 19:53:57.625315 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.625303 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 19:53:57.625677 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.625532 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-116.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 19:53:57.625677 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.625558 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 19:53:57.626281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.626268 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 19:53:57.626321 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.626283 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 19:53:57.629954 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.629942 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 19:53:57.630007 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.629981 2561 server.go:1295] "Started kubelet"
Apr 16 19:53:57.630152 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.630100 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 19:53:57.630203 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.630152 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 19:53:57.630262 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.630236 2561 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 19:53:57.630711 ip-10-0-130-116 systemd[1]: Started Kubernetes Kubelet.
Apr 16 19:53:57.632027 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.631877 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 19:53:57.632573 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.632550 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-116.ec2.internal" not found
Apr 16 19:53:57.633185 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.633168 2561 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 19:53:57.637072 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637059 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 19:53:57.637140 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637077 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:57.637817 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637799 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:53:57.637909 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637811 2561 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:53:57.637909 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637851 2561 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:53:57.637909 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637862 2561 factory.go:55] Registering systemd factory
Apr 16 19:53:57.637909 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.637884 2561 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:53:57.638094 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.638017 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:57.638157 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638106 2561 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 19:53:57.638157 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638126 2561 factory.go:153] Registering CRI-O factory
Apr 16 19:53:57.638157 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638133 2561 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:53:57.638157 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638143 2561 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:53:57.638314 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638217 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:53:57.638314 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638241 2561 factory.go:103] Registering Raw factory
Apr 16 19:53:57.638314 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638270 2561 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:53:57.638644 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.638635 2561 manager.go:319] Starting recovery of all containers
Apr 16 19:53:57.639131 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.639103 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.640082 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.639473 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:53:57.641861 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.641823 2561 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-116.ec2.internal\" not found" node="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.648616 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.648505 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-116.ec2.internal" not found
Apr 16 19:53:57.651192 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.651176 2561 manager.go:324] Recovery completed
Apr 16 19:53:57.655333 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.655321 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.657488 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.657474 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.657546 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.657502 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.657546 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.657515 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.657996 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.657984 2561 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:53:57.658036 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.657995 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:53:57.658036 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.658014 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:57.660355 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.660343 2561 policy_none.go:49] "None policy: Start"
Apr 16 19:53:57.660411 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.660363 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:53:57.660411 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.660374 2561 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:53:57.694828 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.694813 2561 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.694847 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.694857 2561 server.go:85] "Starting device plugin registration server"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.695058 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.695070 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.695172 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.695304 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.695313 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.695786 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 19:53:57.705449 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.695822 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:57.710522 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.710508 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-116.ec2.internal" not found
Apr 16 19:53:57.795894 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.795840 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.796812 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.796796 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.796890 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.796825 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.796890 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.796835 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.796890 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.796861 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.802769 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.802743 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:53:57.803995 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.803979 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:53:57.804057 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.804019 2561 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:53:57.804094 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.804071 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:53:57.804094 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.804082 2561 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:53:57.804175 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.804150 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:53:57.805198 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.805185 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.805285 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.805203 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-116.ec2.internal\": node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:57.807298 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.807277 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:57.845190 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.845168 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:57.904227 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.904194 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"]
Apr 16 19:53:57.904301 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.904289 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.905153 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.905134 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.905244 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.905175 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.905244 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.905190 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.906571 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.906558 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.906713 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.906700 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.906751 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.906729 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.907244 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907228 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.907323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907228 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.907323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907280 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.907323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907292 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.907323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907296 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.907323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.907312 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.908613 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.908598 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.908721 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.908627 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:57.909226 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.909212 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:57.909323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.909235 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:57.909323 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.909264 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:57.930794 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.930777 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-116.ec2.internal\" not found" node="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.935392 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.935373 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-116.ec2.internal\" not found" node="ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.940683 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.940667 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.940733 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.940692 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.940733 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:57.940708 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb38fade52ea6b8de006afe3c20412c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-116.ec2.internal\" (UID: \"0eb38fade52ea6b8de006afe3c20412c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:57.945916 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:57.945899 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.041820 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041793 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.041937 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb38fade52ea6b8de006afe3c20412c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-116.ec2.internal\" (UID: \"0eb38fade52ea6b8de006afe3c20412c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.041937 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041848 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.041937 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041876 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0eb38fade52ea6b8de006afe3c20412c-config\") pod \"kube-apiserver-proxy-ip-10-0-130-116.ec2.internal\" (UID: \"0eb38fade52ea6b8de006afe3c20412c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.041937 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041897 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.041937 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.041919 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6ec743fe0f5f66c8cc32b87339cf525-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal\" (UID: \"c6ec743fe0f5f66c8cc32b87339cf525\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.046180 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.046136 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.146606 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.146582 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.232840 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.232810 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.238402 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.238382 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.247128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.247111 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.347694 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.347621 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.448177 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.448138 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.529353 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.529323 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:53:58.529795 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.529485 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:58.529795 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.529505 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:58.548900 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.548875 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.553166 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.553150 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:58.625978 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.625913 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:57 +0000 UTC" deadline="2027-09-16 23:03:06.656525501 +0000 UTC"
Apr 16 19:53:58.625978 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.625940 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12435h9m8.030588384s"
Apr 16 19:53:58.638083 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.638060 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:58.649268 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.649236 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.650560 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.650543 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:58.671834 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.671811 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-54smv"
Apr 16 19:53:58.678361 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.678345 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-54smv"
Apr 16 19:53:58.750383 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:58.750349 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-116.ec2.internal\" not found"
Apr 16 19:53:58.774878 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:58.774842 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb38fade52ea6b8de006afe3c20412c.slice/crio-27499dc177eb2e3b49bee92c650e7a9a82251c943759fc4fa8bcfe2b85de343b WatchSource:0}: Error finding container 27499dc177eb2e3b49bee92c650e7a9a82251c943759fc4fa8bcfe2b85de343b: Status 404 returned error can't find the container with id 27499dc177eb2e3b49bee92c650e7a9a82251c943759fc4fa8bcfe2b85de343b
Apr 16 19:53:58.775201 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:53:58.775181 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ec743fe0f5f66c8cc32b87339cf525.slice/crio-fcaea935281df0e6b1531f98a88e6ed21dbc424672435961fd1df59c9b6beac2 WatchSource:0}: Error finding container fcaea935281df0e6b1531f98a88e6ed21dbc424672435961fd1df59c9b6beac2: Status 404 returned error can't find the container with id fcaea935281df0e6b1531f98a88e6ed21dbc424672435961fd1df59c9b6beac2
Apr 16 19:53:58.780041 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.780026 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:53:58.806576 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.806529 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal" event={"ID":"c6ec743fe0f5f66c8cc32b87339cf525","Type":"ContainerStarted","Data":"fcaea935281df0e6b1531f98a88e6ed21dbc424672435961fd1df59c9b6beac2"}
Apr 16 19:53:58.807434 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.807407 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal" event={"ID":"0eb38fade52ea6b8de006afe3c20412c","Type":"ContainerStarted","Data":"27499dc177eb2e3b49bee92c650e7a9a82251c943759fc4fa8bcfe2b85de343b"}
Apr 16 19:53:58.830747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.830139 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:58.837531 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.837513 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.850182 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.850164 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:58.851170 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.851149 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal"
Apr 16 19:53:58.859659 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:58.859640 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:53:59.597877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.597843 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:59.603336 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.603272 2561 apiserver.go:52] "Watching apiserver"
Apr 16 19:53:59.613640 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.613615 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:53:59.615133 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.615104 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-w96qg","openshift-network-diagnostics/network-check-target-qq6lx","openshift-ovn-kubernetes/ovnkube-node-2px7v","kube-system/konnectivity-agent-67lnf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h","openshift-cluster-node-tuning-operator/tuned-d2hvw","openshift-multus/multus-4s7gc","openshift-multus/network-metrics-daemon-vx6n5","openshift-network-operator/iptables-alerter-z5st5","kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal","openshift-image-registry/node-ca-kzf5d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal"]
Apr 16 19:53:59.617800 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.617778 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.618888 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.618846 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:53:59.619005 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.618923 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863"
Apr 16 19:53:59.619317 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.619296 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:59.620675 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.620647 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:53:59.620675 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.620686 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.620903 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.620832 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:53:59.621281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.621143 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dc4zp\""
Apr 16 19:53:59.621281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.621183 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:53:59.621281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.621198 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.621281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.621211 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.621922 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.621858 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:53:59.622980 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.622939 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.623103 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.623047 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:53:59.623799 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.623777 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-d5svt\""
Apr 16 19:53:59.624747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.624728 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.624835 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.624755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:53:59.624835 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.624800 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:53:59.625028 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625016 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:53:59.625105 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625083 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:53:59.625185 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625123 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.625311 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625284 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.625574 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625556 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:53:59.625700 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.625581 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zhggq\""
Apr 16 19:53:59.627097 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.626783 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.627651 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.627632 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.627763 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.627659 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.627901 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.627847 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:53:59.627976 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.627901 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-89t4j\""
Apr 16 19:53:59.630108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.630085 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:53:59.630206 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.630164 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:53:59.630300 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.630231 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z5st5"
Apr 16 19:53:59.632999 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.631933 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kzf5d"
Apr 16 19:53:59.633700 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.633679 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r72cj\""
Apr 16 19:53:59.633805 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.633788 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.633909 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.633892 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:53:59.634020 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.634003 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.634136 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.634119 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9tprm\""
Apr 16 19:53:59.634203 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.634134 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gklx6\""
Apr 16 19:53:59.635456 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635431 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.635544 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635432 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.635544 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635520 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xsjdh\""
Apr 16 19:53:59.635638 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635577 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:53:59.635684 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635647 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:59.635768 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:53:59.635823 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.635810 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:59.638917 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.638897 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:53:59.650424 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650400 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-binary-copy\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650462 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b2bea17-ec01-4d33-b497-41bb92b91043-agent-certs\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650486 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-conf-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650513 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6rjw\" (UniqueName: \"kubernetes.io/projected/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-kube-api-access-p6rjw\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650534 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysconfig\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.650551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650551 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tgbf\" (UniqueName: \"kubernetes.io/projected/c76d029a-2a7e-46c6-a287-817cea8544c3-kube-api-access-5tgbf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650565 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-registration-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650581 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-etc-selinux\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650595 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bfrd\" (UniqueName: \"kubernetes.io/projected/5cd7f671-aebd-4643-9419-5f6ae541d400-kube-api-access-2bfrd\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650616 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-bin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650640 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-host\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650654 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-system-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650667 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-os-release\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650693 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650740 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b2bea17-ec01-4d33-b497-41bb92b91043-konnectivity-ca\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650765 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.650853 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650853 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650880 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-kubelet\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650955 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-systemd-units\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.650995 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-ovn\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651019 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-log-socket\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651080 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651105 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651225 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc49t\" (UniqueName: \"kubernetes.io/projected/d12d0814-9c34-4bb1-b975-721e0ecd4752-kube-api-access-sc49t\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-systemd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651343 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651300 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-conf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651396 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-tmp\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651441 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-socket-dir-parent\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651505 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-daemon-config\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651531 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovn-node-metrics-cert\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651574 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cni-binary-copy\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651599 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-kubelet\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651666 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.651802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651748 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-run\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651772 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-lib-modules\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651851 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-var-lib-kubelet\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651894 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-serviceca\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651953 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.651977 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-k8s-cni-cncf-io\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652028 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7kw7\" (UniqueName: \"kubernetes.io/projected/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-kube-api-access-v7kw7\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652052 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-netd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652104 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652128 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-bin\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652168 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-tuned\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652213 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-node-log\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652263 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652302 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-modprobe-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652320 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-kubernetes\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652335 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-systemd\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652349 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-os-release\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652365 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-slash\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652380 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95692ec0-fe71-4218-8b62-1622b9caabaa-iptables-alerter-script\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652395 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffm5j\" (UniqueName: \"kubernetes.io/projected/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-kube-api-access-ffm5j\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652409 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-sys-fs\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652423 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-socket-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652436 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-config\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652451 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-netns\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652464 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-sys\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652509 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95692ec0-fe71-4218-8b62-1622b9caabaa-host-slash\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5"
Apr 16 19:53:59.652741 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652542 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-hostroot\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652568 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-multus-certs\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652590 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndfp4\" (UniqueName: \"kubernetes.io/projected/95692ec0-fe71-4218-8b62-1622b9caabaa-kube-api-access-ndfp4\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652609 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-system-cni-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652631 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652660 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-device-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652698 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-cnibin\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652725 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-var-lib-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652751 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-host\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652775 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-etc-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652799 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-script-lib\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652820 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-netns\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652842 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-etc-kubernetes\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652863 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-env-overrides\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652888 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7pj\" (UniqueName: \"kubernetes.io/projected/e71bfafb-1a55-495e-afc2-a4ffd47dedea-kube-api-access-mc7pj\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652908 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cnibin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.653492 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.652923 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-multus\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.679589 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.679559 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:58 +0000 UTC" deadline="2027-09-30 16:19:39.925347756 +0000 UTC"
Apr 16 19:53:59.679694 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.679589 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12764h25m40.245762386s"
Apr 16 19:53:59.753356 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753318 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-systemd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.753356 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753351 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-conf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753374 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-tmp\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753397 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-socket-dir-parent\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753415 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-daemon-config\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753441 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovn-node-metrics-cert\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753443 2561 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-systemd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cni-binary-copy\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753491 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-kubelet\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753515 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753537 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-run\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753543 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-conf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.753575 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753559 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-lib-modules\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753586 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-var-lib-kubelet\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753610 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-serviceca\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753635 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753658 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-k8s-cni-cncf-io\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753681 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7kw7\" (UniqueName: \"kubernetes.io/projected/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-kube-api-access-v7kw7\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753707 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-netd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753747 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-bin\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753764 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-tuned\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753780 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-node-log\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753769 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753798 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-socket-dir-parent\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753809 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753837 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-modprobe-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753862 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-kubernetes\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753886 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-systemd\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753890 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754050 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753912 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-os-release\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753935 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-lib-modules\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753943 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-kubelet\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753938 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-slash\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753982 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-slash\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753985 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95692ec0-fe71-4218-8b62-1622b9caabaa-iptables-alerter-script\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.753994 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-var-lib-kubelet\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754046 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffm5j\" (UniqueName: \"kubernetes.io/projected/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-kube-api-access-ffm5j\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754052 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-daemon-config\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754066 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-run\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754074 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-sys-fs\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754099 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-socket-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754119 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-node-log\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-config\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754153 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-netns\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754163 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-netd\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754176 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-sys\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754209 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-k8s-cni-cncf-io\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.754867 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754221 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95692ec0-fe71-4218-8b62-1622b9caabaa-host-slash\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754261 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-hostroot\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754284 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754327 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-multus-certs\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754337 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754287 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-multus-certs\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754375 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-cni-bin\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754383 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndfp4\" (UniqueName: \"kubernetes.io/projected/95692ec0-fe71-4218-8b62-1622b9caabaa-kube-api-access-ndfp4\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754395 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-sys-fs\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754410 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-system-cni-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754453 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.755734 ip-10-0-130-116 
kubenswrapper[2561]: I0416 19:53:59.754477 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-device-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754510 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-modprobe-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754503 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-cnibin\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754564 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-kubernetes\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754565 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754570 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-var-lib-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.755734 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754573 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/95692ec0-fe71-4218-8b62-1622b9caabaa-iptables-alerter-script\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754614 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-systemd\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754619 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-var-lib-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: 
\"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754639 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-host\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754669 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-etc-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754678 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-device-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754691 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysctl-d\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754718 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-cnibin\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754761 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-sys\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754798 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-host\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754801 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-socket-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754859 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-netns\") pod \"ovnkube-node-2px7v\" (UID: 
\"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754899 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-etc-openvswitch\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754953 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-hostroot\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754971 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-serviceca\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754972 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-system-cni-dir\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754696 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-script-lib\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755000 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/95692ec0-fe71-4218-8b62-1622b9caabaa-host-slash\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.756548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755018 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-netns\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755027 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12d0814-9c34-4bb1-b975-721e0ecd4752-os-release\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.754338 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cni-binary-copy\") pod \"multus-4s7gc\" (UID: 
\"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755044 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-etc-kubernetes\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755069 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-run-netns\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755092 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-etc-kubernetes\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755091 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-env-overrides\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755664 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc7pj\" (UniqueName: \"kubernetes.io/projected/e71bfafb-1a55-495e-afc2-a4ffd47dedea-kube-api-access-mc7pj\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755691 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cnibin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755715 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-multus\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755741 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755771 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755788 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-script-lib\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755795 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b2bea17-ec01-4d33-b497-41bb92b91043-agent-certs\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755846 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-conf-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755877 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6rjw\" (UniqueName: \"kubernetes.io/projected/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-kube-api-access-p6rjw\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755902 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysconfig\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.758108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755927 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgbf\" (UniqueName: \"kubernetes.io/projected/c76d029a-2a7e-46c6-a287-817cea8544c3-kube-api-access-5tgbf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.755977 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-registration-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756002 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-etc-selinux\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756027 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2bfrd\" (UniqueName: \"kubernetes.io/projected/5cd7f671-aebd-4643-9419-5f6ae541d400-kube-api-access-2bfrd\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756048 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-multus\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756100 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-multus-conf-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756107 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-bin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756154 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-cnibin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756158 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-registration-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756056 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-host-var-lib-cni-bin\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756195 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-host\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756221 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-system-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756264 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-os-release\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756265 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756290 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756317 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b2bea17-ec01-4d33-b497-41bb92b91043-konnectivity-ca\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756330 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-host\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.758958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756341 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756366 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-sysconfig\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756367 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756411 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-etc-selinux\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756418 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-kubelet\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756450 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-systemd-units\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756475 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-ovn\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756499 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-log-socket\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756529 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756555 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756580 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc49t\" (UniqueName: \"kubernetes.io/projected/d12d0814-9c34-4bb1-b975-721e0ecd4752-kube-api-access-sc49t\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756648 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-systemd-units\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756789 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-kubelet\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.756826 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.756873 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.756988 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.256959536 +0000 UTC m=+3.096764942 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757068 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovnkube-config\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.759747 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-system-cni-dir\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757161 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cd7f671-aebd-4643-9419-5f6ae541d400-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757178 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-os-release\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757198 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-log-socket\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.760563 ip-10-0-130-116 
kubenswrapper[2561]: I0416 19:53:59.757217 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-host-run-ovn-kubernetes\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757259 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71bfafb-1a55-495e-afc2-a4ffd47dedea-run-ovn\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757361 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b2bea17-ec01-4d33-b497-41bb92b91043-konnectivity-ca\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757424 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-etc-tuned\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.757974 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c76d029a-2a7e-46c6-a287-817cea8544c3-tmp\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.758134 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e71bfafb-1a55-495e-afc2-a4ffd47dedea-env-overrides\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.758231 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12d0814-9c34-4bb1-b975-721e0ecd4752-cni-binary-copy\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.758462 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e71bfafb-1a55-495e-afc2-a4ffd47dedea-ovn-node-metrics-cert\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:53:59.760563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.759433 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b2bea17-ec01-4d33-b497-41bb92b91043-agent-certs\") pod \"konnectivity-agent-67lnf\" (UID: \"2b2bea17-ec01-4d33-b497-41bb92b91043\") " pod="kube-system/konnectivity-agent-67lnf" Apr 16 19:53:59.780749 ip-10-0-130-116 
Apr 16 19:53:59.780749 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.780728 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7kw7\" (UniqueName: \"kubernetes.io/projected/a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b-kube-api-access-v7kw7\") pod \"multus-4s7gc\" (UID: \"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b\") " pod="openshift-multus/multus-4s7gc"
Apr 16 19:53:59.789519 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.789492 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndfp4\" (UniqueName: \"kubernetes.io/projected/95692ec0-fe71-4218-8b62-1622b9caabaa-kube-api-access-ndfp4\") pod \"iptables-alerter-z5st5\" (UID: \"95692ec0-fe71-4218-8b62-1622b9caabaa\") " pod="openshift-network-operator/iptables-alerter-z5st5"
Apr 16 19:53:59.793547 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.793523 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bfrd\" (UniqueName: \"kubernetes.io/projected/5cd7f671-aebd-4643-9419-5f6ae541d400-kube-api-access-2bfrd\") pod \"aws-ebs-csi-driver-node-8gd2h\" (UID: \"5cd7f671-aebd-4643-9419-5f6ae541d400\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.794591 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.794565 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6rjw\" (UniqueName: \"kubernetes.io/projected/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-kube-api-access-p6rjw\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:53:59.795197 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.794996 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:59.795197 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.795023 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:59.795197 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.795038 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:59.795197 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:53:59.795113 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:00.295094562 +0000 UTC m=+3.134899961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:59.796872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.796846 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc49t\" (UniqueName: \"kubernetes.io/projected/d12d0814-9c34-4bb1-b975-721e0ecd4752-kube-api-access-sc49t\") pod \"multus-additional-cni-plugins-w96qg\" (UID: \"d12d0814-9c34-4bb1-b975-721e0ecd4752\") " pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.797407 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.797385 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tgbf\" (UniqueName: \"kubernetes.io/projected/c76d029a-2a7e-46c6-a287-817cea8544c3-kube-api-access-5tgbf\") pod \"tuned-d2hvw\" (UID: \"c76d029a-2a7e-46c6-a287-817cea8544c3\") " pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.797485 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.797394 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7pj\" (UniqueName: \"kubernetes.io/projected/e71bfafb-1a55-495e-afc2-a4ffd47dedea-kube-api-access-mc7pj\") pod \"ovnkube-node-2px7v\" (UID: \"e71bfafb-1a55-495e-afc2-a4ffd47dedea\") " pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.798435 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.798417 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffm5j\" (UniqueName: \"kubernetes.io/projected/f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55-kube-api-access-ffm5j\") pod \"node-ca-kzf5d\" (UID: \"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55\") " pod="openshift-image-registry/node-ca-kzf5d"
Apr 16 19:53:59.935852 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.935759 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w96qg"
Apr 16 19:53:59.942713 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.942689 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v"
Apr 16 19:53:59.952370 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.952347 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:53:59.959037 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.959019 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h"
Apr 16 19:53:59.971863 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.971842 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d2hvw"
Apr 16 19:53:59.978426 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.978405 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4s7gc"
Need to start a new one" pod="openshift-network-operator/iptables-alerter-z5st5" Apr 16 19:53:59.990435 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:53:59.990419 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kzf5d" Apr 16 19:54:00.260623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.260530 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:00.260789 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.260682 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.260789 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.260759 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.260744032 +0000 UTC m=+4.100549410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:00.330505 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.330476 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc76d029a_2a7e_46c6_a287_817cea8544c3.slice/crio-4e86812d583f6d61216886b71433e6809d86f9f7e99d539ee15e1c4b1ffe5454 WatchSource:0}: Error finding container 4e86812d583f6d61216886b71433e6809d86f9f7e99d539ee15e1c4b1ffe5454: Status 404 returned error can't find the container with id 4e86812d583f6d61216886b71433e6809d86f9f7e99d539ee15e1c4b1ffe5454 Apr 16 19:54:00.332049 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.332025 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71bfafb_1a55_495e_afc2_a4ffd47dedea.slice/crio-0c1d7e453ce92fafa35a70b2744e02b544e43e1c6b0173e230f2578ac9130178 WatchSource:0}: Error finding container 0c1d7e453ce92fafa35a70b2744e02b544e43e1c6b0173e230f2578ac9130178: Status 404 returned error can't find the container with id 0c1d7e453ce92fafa35a70b2744e02b544e43e1c6b0173e230f2578ac9130178 Apr 16 19:54:00.335144 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.335122 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12d0814_9c34_4bb1_b975_721e0ecd4752.slice/crio-c9416c20b5667face7407fff7657f6aa8a9141db8afa1167dcebc71da21b5be6 WatchSource:0}: Error finding container c9416c20b5667face7407fff7657f6aa8a9141db8afa1167dcebc71da21b5be6: Status 404 returned error can't find the container with id c9416c20b5667face7407fff7657f6aa8a9141db8afa1167dcebc71da21b5be6 Apr 16 19:54:00.335760 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.335741 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0fbaacd_1e65_4ba6_a6df_3fd552c3bb55.slice/crio-97b3bd10b0db3ff04c7f81a3c4b97b6ec6f386aed49c4ae4939b18b10418de0a WatchSource:0}: Error finding container 97b3bd10b0db3ff04c7f81a3c4b97b6ec6f386aed49c4ae4939b18b10418de0a: Status 404 returned error can't find the container with id 97b3bd10b0db3ff04c7f81a3c4b97b6ec6f386aed49c4ae4939b18b10418de0a Apr 16 19:54:00.336460 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.336335 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cd7f671_aebd_4643_9419_5f6ae541d400.slice/crio-caeb333a24ecdee8e5ec4b668c798063304a9bd6df00859ecb528c336874db55 WatchSource:0}: Error finding container caeb333a24ecdee8e5ec4b668c798063304a9bd6df00859ecb528c336874db55: Status 404 returned error can't find the container with id caeb333a24ecdee8e5ec4b668c798063304a9bd6df00859ecb528c336874db55 Apr 16 19:54:00.337784 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.337671 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b69b87_4fb1_45dc_ba8f_f3e5ee3ef11b.slice/crio-ce70558856867e2cfc8fe0849d6e5fc0a4a0e982016ba4b891298a6e45dcd3e9 WatchSource:0}: Error finding container ce70558856867e2cfc8fe0849d6e5fc0a4a0e982016ba4b891298a6e45dcd3e9: Status 404 returned error can't find the container with id ce70558856867e2cfc8fe0849d6e5fc0a4a0e982016ba4b891298a6e45dcd3e9 Apr 16 19:54:00.339029 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.339003 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b2bea17_ec01_4d33_b497_41bb92b91043.slice/crio-e6393ecfc0f503e83917befc6d8aba3dc070510f1e06b587fe1eaa9757887a54 WatchSource:0}: Error finding container e6393ecfc0f503e83917befc6d8aba3dc070510f1e06b587fe1eaa9757887a54: Status 404 returned error can't find the container with id e6393ecfc0f503e83917befc6d8aba3dc070510f1e06b587fe1eaa9757887a54 Apr 16 19:54:00.340397 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:00.340369 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95692ec0_fe71_4218_8b62_1622b9caabaa.slice/crio-b1f5a4aeea19e7ea0c013fd0371e81d9b3fc95e83540b93500ece1019b7232b2 WatchSource:0}: Error finding container b1f5a4aeea19e7ea0c013fd0371e81d9b3fc95e83540b93500ece1019b7232b2: Status 404 returned error can't find the container with id b1f5a4aeea19e7ea0c013fd0371e81d9b3fc95e83540b93500ece1019b7232b2 Apr 16 19:54:00.360870 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.360848 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:00.360987 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.360969 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:00.360987 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.360985 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:00.361096 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.360994 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.361096 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.361035 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.361021569 +0000 UTC m=+4.200826961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:00.680241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.679939 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:58 +0000 UTC" deadline="2027-12-14 11:36:54.042595707 +0000 UTC" Apr 16 19:54:00.680241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.680170 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14559h42m53.362430611s" Apr 16 19:54:00.805727 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.805216 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:00.805727 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:00.805357 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:00.813242 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.813211 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerStarted","Data":"c9416c20b5667face7407fff7657f6aa8a9141db8afa1167dcebc71da21b5be6"} Apr 16 19:54:00.824430 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.824395 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"0c1d7e453ce92fafa35a70b2744e02b544e43e1c6b0173e230f2578ac9130178"} Apr 16 19:54:00.828691 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.828648 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" event={"ID":"c76d029a-2a7e-46c6-a287-817cea8544c3","Type":"ContainerStarted","Data":"4e86812d583f6d61216886b71433e6809d86f9f7e99d539ee15e1c4b1ffe5454"} Apr 16 19:54:00.830354 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.830304 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z5st5" event={"ID":"95692ec0-fe71-4218-8b62-1622b9caabaa","Type":"ContainerStarted","Data":"b1f5a4aeea19e7ea0c013fd0371e81d9b3fc95e83540b93500ece1019b7232b2"} Apr 16 19:54:00.833089 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.833037 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-67lnf" event={"ID":"2b2bea17-ec01-4d33-b497-41bb92b91043","Type":"ContainerStarted","Data":"e6393ecfc0f503e83917befc6d8aba3dc070510f1e06b587fe1eaa9757887a54"} Apr 16 19:54:00.840117 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.840078 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kzf5d" event={"ID":"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55","Type":"ContainerStarted","Data":"97b3bd10b0db3ff04c7f81a3c4b97b6ec6f386aed49c4ae4939b18b10418de0a"} Apr 16 19:54:00.842825 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.842797 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal" event={"ID":"0eb38fade52ea6b8de006afe3c20412c","Type":"ContainerStarted","Data":"d186502ed7262451c1b7425256c73ca5087bcc6927d810526aa5232089e2db20"} Apr 16 19:54:00.847065 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.847039 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" event={"ID":"5cd7f671-aebd-4643-9419-5f6ae541d400","Type":"ContainerStarted","Data":"caeb333a24ecdee8e5ec4b668c798063304a9bd6df00859ecb528c336874db55"} Apr 16 19:54:00.857164 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:00.857111 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-116.ec2.internal" podStartSLOduration=2.8570969760000002 podStartE2EDuration="2.857096976s" podCreationTimestamp="2026-04-16 19:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:00.856615605 +0000 UTC m=+3.696421008" watchObservedRunningTime="2026-04-16 19:54:00.857096976 +0000 UTC m=+3.696902380" Apr 16 19:54:00.857965 ip-10-0-130-116 kubenswrapper[2561]: I0416 
19:54:00.857924 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4s7gc" event={"ID":"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b","Type":"ContainerStarted","Data":"ce70558856867e2cfc8fe0849d6e5fc0a4a0e982016ba4b891298a6e45dcd3e9"} Apr 16 19:54:01.267951 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:01.267916 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:01.268121 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.268074 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.268181 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.268136 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:03.268116836 +0000 UTC m=+6.107922221 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.368951 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:01.368917 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:01.369127 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.369103 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:01.369226 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.369138 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:01.369226 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.369153 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.369226 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.369222 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:03.369201823 +0000 UTC m=+6.209007208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.805445 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:01.805411 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:01.805963 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:01.805559 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:01.875495 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:01.875460 2561 generic.go:358] "Generic (PLEG): container finished" podID="c6ec743fe0f5f66c8cc32b87339cf525" containerID="39cd5efe7e36e4b67d2cb23c11fb796440766437157ed369134695d1b9173689" exitCode=0 Apr 16 19:54:01.876430 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:01.876373 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal" event={"ID":"c6ec743fe0f5f66c8cc32b87339cf525","Type":"ContainerDied","Data":"39cd5efe7e36e4b67d2cb23c11fb796440766437157ed369134695d1b9173689"} Apr 16 19:54:02.804423 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:02.804388 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:02.804617 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:02.804521 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:02.883291 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:02.882615 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal" event={"ID":"c6ec743fe0f5f66c8cc32b87339cf525","Type":"ContainerStarted","Data":"fa753447563c365a7853f1e46fd87ccdabf4fb8e64db15b3fea34ec408db327e"} Apr 16 19:54:03.284546 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:03.284443 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:03.284723 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.284621 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:03.284723 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.284699 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.284680363 +0000 UTC m=+10.124485742 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:03.385871 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:03.385828 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:03.386057 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.386010 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:03.386057 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.386028 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:03.386057 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.386042 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:03.386211 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.386101 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:07.386081066 +0000 UTC m=+10.225886464 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:03.805467 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:03.805335 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:03.805659 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:03.805500 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:04.804775 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:04.804577 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:04.804775 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:04.804711 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:05.804450 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:05.804411 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:05.804693 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:05.804565 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:06.805147 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:06.805112 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:06.805596 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:06.805264 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:07.317266 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.316680 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:07.317266 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.316822 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:07.317266 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.316905 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.316884312 +0000 UTC m=+18.156689696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:07.417371 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.417328 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:07.417542 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.417489 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:07.417542 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.417508 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:07.417542 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.417521 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:07.417702 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.417584 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.417565105 +0000 UTC m=+18.257370490 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:07.527276 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.527211 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-116.ec2.internal" podStartSLOduration=9.527196175 podStartE2EDuration="9.527196175s" podCreationTimestamp="2026-04-16 19:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:02.897567373 +0000 UTC m=+5.737372775" watchObservedRunningTime="2026-04-16 19:54:07.527196175 +0000 UTC m=+10.367001576" Apr 16 19:54:07.528337 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.528286 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z89qz"] Apr 16 19:54:07.530887 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.530844 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.531004 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.530923 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:07.619620 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.619536 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-kubelet-config\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.619620 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.619595 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.619802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.619629 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-dbus\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.720208 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.720171 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.720397 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.720238 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-dbus\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.720397 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.720344 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-kubelet-config\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.720397 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.720370 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:07.720570 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.720440 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:08.220419686 +0000 UTC m=+11.060225077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:07.720570 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.720441 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-kubelet-config\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.720570 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.720567 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3ada28f2-9643-4885-b86d-53b74a05e6a5-dbus\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:07.806158 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:07.806126 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:07.806605 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:07.806264 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:08.225358 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:08.225321 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:08.225537 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:08.225486 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:08.225606 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:08.225565 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:09.225546007 +0000 UTC m=+12.065351392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:08.805333 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:08.805295 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:08.805529 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:08.805295 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:08.805529 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:08.805419 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:08.805660 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:08.805521 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:09.232116 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:09.232033 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:09.232571 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:09.232167 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:09.232571 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:09.232238 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:11.232217988 +0000 UTC m=+14.072023374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:09.805267 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:09.805223 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:09.805464 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:09.805383 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:10.804762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:10.804731 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:10.805218 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:10.804731 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:10.805218 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:10.804853 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:10.805218 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:10.804940 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:11.248740 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.248653 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:11.248901 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:11.248784 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:11.248901 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:11.248844 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:15.248831244 +0000 UTC m=+18.088636622 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:11.639651 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.639567 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xlktd"] Apr 16 19:54:11.642585 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.642553 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.645680 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.645660 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-x284t\"" Apr 16 19:54:11.645802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.645720 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:11.647592 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.647559 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:11.751595 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.751559 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hfcj\" (UniqueName: \"kubernetes.io/projected/487cb337-b28d-43b2-8430-bacf43a71449-kube-api-access-4hfcj\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.751761 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.751602 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/487cb337-b28d-43b2-8430-bacf43a71449-tmp-dir\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.751761 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.751627 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/487cb337-b28d-43b2-8430-bacf43a71449-hosts-file\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.807288 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.807261 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:11.807699 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:11.807401 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:11.852936 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.852899 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hfcj\" (UniqueName: \"kubernetes.io/projected/487cb337-b28d-43b2-8430-bacf43a71449-kube-api-access-4hfcj\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.852936 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.852944 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/487cb337-b28d-43b2-8430-bacf43a71449-tmp-dir\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.853178 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.852979 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/487cb337-b28d-43b2-8430-bacf43a71449-hosts-file\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.853178 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.853050 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/487cb337-b28d-43b2-8430-bacf43a71449-hosts-file\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.853322 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.853303 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/487cb337-b28d-43b2-8430-bacf43a71449-tmp-dir\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.864099 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.864070 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hfcj\" (UniqueName: \"kubernetes.io/projected/487cb337-b28d-43b2-8430-bacf43a71449-kube-api-access-4hfcj\") pod \"node-resolver-xlktd\" (UID: \"487cb337-b28d-43b2-8430-bacf43a71449\") " pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:11.952212 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:11.952127 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xlktd" Apr 16 19:54:12.805364 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:12.805279 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:12.805614 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:12.805279 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:12.805614 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:12.805415 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:12.805614 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:12.805492 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:13.805034 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:13.804993 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:13.805504 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:13.805118 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:14.805243 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:14.805202 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:14.805737 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:14.805202 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:14.805737 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:14.805339 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:14.805737 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:14.805438 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:15.275948 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:15.275847 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:15.276112 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.275998 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:15.276112 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.276077 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:23.276055023 +0000 UTC m=+26.115860403 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:15.376759 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:15.376722 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:15.376941 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.376897 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:15.377010 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.376983 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:31.376961262 +0000 UTC m=+34.216766662 (durationBeforeRetry 16s). 
Apr 16 19:54:15.477863 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:15.477817 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:54:15.478032 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.478005 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:15.478032 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.478029 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:15.478128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.478043 2561 projected.go:194] Error preparing data for projected volume kube-api-access-5fbcj for pod openshift-network-diagnostics/network-check-target-qq6lx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:15.478128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.478110 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj podName:10a9f71c-fe60-4983-820b-1a8007ba1863 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:31.478089561 +0000 UTC m=+34.317894959 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5fbcj" (UniqueName: "kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj") pod "network-check-target-qq6lx" (UID: "10a9f71c-fe60-4983-820b-1a8007ba1863") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:15.804958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:15.804925 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:15.805123 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:15.805059 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:54:16.632741 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:16.632713 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487cb337_b28d_43b2_8430_bacf43a71449.slice/crio-cfa64d2efb938614cddc7a23278f93e19aef1a7f844d477b0ef3de47f0ac0ff3 WatchSource:0}: Error finding container cfa64d2efb938614cddc7a23278f93e19aef1a7f844d477b0ef3de47f0ac0ff3: Status 404 returned error can't find the container with id cfa64d2efb938614cddc7a23278f93e19aef1a7f844d477b0ef3de47f0ac0ff3
Apr 16 19:54:16.805331 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.805059 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:54:16.805828 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:16.805447 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863"
Apr 16 19:54:16.805828 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.805535 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz"
Apr 16 19:54:16.805828 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:16.805645 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5"
Apr 16 19:54:16.907647 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.907618 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" event={"ID":"5cd7f671-aebd-4643-9419-5f6ae541d400","Type":"ContainerStarted","Data":"32220c75c812b2b11e16596fea3f6409021cb8f70f2987c72960ce9ff5c39db4"}
Apr 16 19:54:16.909335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.909023 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4s7gc" event={"ID":"a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b","Type":"ContainerStarted","Data":"93ecf802f14e95162e801735d489083215763d8ce520ee0ce34a37538ab6243c"}
Apr 16 19:54:16.910727 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.910696 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerStarted","Data":"da89102a480cfd98a83518ef23036d5636cd1c241bd2539d12bd033b38b7fb46"}
Apr 16 19:54:16.912108 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.912066 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"98554d09770d94084deb504294126730606784dc762296dd867a0437365720dc"}
Apr 16 19:54:16.913863 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.913386 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" event={"ID":"c76d029a-2a7e-46c6-a287-817cea8544c3","Type":"ContainerStarted","Data":"df21141b72afc2ad2d27fa1ccbdf9daedf3b740be94b3b314724cbb1c353e2ad"}
Apr 16 19:54:16.916086 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.915109 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xlktd" event={"ID":"487cb337-b28d-43b2-8430-bacf43a71449","Type":"ContainerStarted","Data":"cfa64d2efb938614cddc7a23278f93e19aef1a7f844d477b0ef3de47f0ac0ff3"}
Apr 16 19:54:16.916563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.916531 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-67lnf" event={"ID":"2b2bea17-ec01-4d33-b497-41bb92b91043","Type":"ContainerStarted","Data":"af51e4799970e56e0130a1ada92983921e3f86860a2abf67b6912e406be4e8ae"}
Apr 16 19:54:16.918092 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.918070 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kzf5d" event={"ID":"f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55","Type":"ContainerStarted","Data":"7dd04a134e16bab4a1b7291749c6e03b3fe1319eb1ab5f73c46460e188c72f2e"}
Apr 16 19:54:16.927782 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.927731 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4s7gc" podStartSLOduration=3.595820348 podStartE2EDuration="19.927721679s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.341580166 +0000 UTC m=+3.181385546" lastFinishedPulling="2026-04-16 19:54:16.673481498 +0000 UTC m=+19.513286877" observedRunningTime="2026-04-16 19:54:16.927403212 +0000 UTC m=+19.767208625" watchObservedRunningTime="2026-04-16 19:54:16.927721679 +0000 UTC m=+19.767527079"
Apr 16 19:54:16.941263 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.941206 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-67lnf" podStartSLOduration=3.6636739499999997 podStartE2EDuration="19.941194754s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.344343857 +0000 UTC m=+3.184149243" lastFinishedPulling="2026-04-16 19:54:16.621864659 +0000 UTC m=+19.461670047" observedRunningTime="2026-04-16 19:54:16.940806616 +0000 UTC m=+19.780612017" watchObservedRunningTime="2026-04-16 19:54:16.941194754 +0000 UTC m=+19.781000154"
Apr 16 19:54:16.958625 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.958592 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-d2hvw" podStartSLOduration=3.670366718 podStartE2EDuration="19.958581192s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.3336108 +0000 UTC m=+3.173416179" lastFinishedPulling="2026-04-16 19:54:16.621825259 +0000 UTC m=+19.461630653" observedRunningTime="2026-04-16 19:54:16.958445906 +0000 UTC m=+19.798251297" watchObservedRunningTime="2026-04-16 19:54:16.958581192 +0000 UTC m=+19.798386592"
Apr 16 19:54:16.975593 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:16.975556 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kzf5d" podStartSLOduration=3.69128522 podStartE2EDuration="19.975544595s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.337591084 +0000 UTC m=+3.177396462" lastFinishedPulling="2026-04-16 19:54:16.621850452 +0000 UTC m=+19.461655837" observedRunningTime="2026-04-16 19:54:16.975153081 +0000 UTC m=+19.814958493" watchObservedRunningTime="2026-04-16 19:54:16.975544595 +0000 UTC m=+19.815349996"
Apr 16 19:54:17.445522 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.445266 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:54:17.446006 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.445915 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:54:17.805069 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.804986 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:17.805913 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:17.805076 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:17.922313 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.922271 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="da89102a480cfd98a83518ef23036d5636cd1c241bd2539d12bd033b38b7fb46" exitCode=0 Apr 16 19:54:17.922474 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.922360 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"da89102a480cfd98a83518ef23036d5636cd1c241bd2539d12bd033b38b7fb46"} Apr 16 19:54:17.924876 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.924857 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 19:54:17.925133 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925116 2561 generic.go:358] "Generic (PLEG): container finished" podID="e71bfafb-1a55-495e-afc2-a4ffd47dedea" containerID="b1af1d73d53f91f044fc0521133f22132c677c4535e6785bccdac082fe74b2ad" exitCode=1 Apr 16 19:54:17.925194 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925174 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"ef440b977b8cc8541c77a006b7374e8e67a161c5eaf50bd28f61b705e17d29db"} Apr 16 19:54:17.925229 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925202 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"e4741314d4c00e10a728bb366b4ce74fa093fcbd39b7c34e5c66636f14e54573"} Apr 16 19:54:17.925229 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925215 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"933c4393c308c52f731047d08b43c03f8c9ccbe89bc598b4c3389703d77f69c2"} Apr 16 19:54:17.925318 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925226 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"de4284c5631ffdec4a39d63ebcd8cd2d53df0943f069a241d5e06a84d26c0c0d"} Apr 16 19:54:17.925318 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.925239 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerDied","Data":"b1af1d73d53f91f044fc0521133f22132c677c4535e6785bccdac082fe74b2ad"} Apr 16 19:54:17.926345 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.926321 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xlktd" event={"ID":"487cb337-b28d-43b2-8430-bacf43a71449","Type":"ContainerStarted","Data":"65eb35d314d3353cc325bdcce991aaa56c4ebdf9460c42085b4484f5a122102f"} Apr 16 19:54:17.927559 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.927535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z5st5" 
event={"ID":"95692ec0-fe71-4218-8b62-1622b9caabaa","Type":"ContainerStarted","Data":"15c4cd8c642cd8c639a17f6ff4156236aa9a70e971fe4036ad153cc513e785c1"} Apr 16 19:54:17.953142 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.953100 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xlktd" podStartSLOduration=6.953086968 podStartE2EDuration="6.953086968s" podCreationTimestamp="2026-04-16 19:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:17.952964083 +0000 UTC m=+20.792769484" watchObservedRunningTime="2026-04-16 19:54:17.953086968 +0000 UTC m=+20.792892369" Apr 16 19:54:17.964742 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:17.964694 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z5st5" podStartSLOduration=4.686678356 podStartE2EDuration="20.964680436s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.343812408 +0000 UTC m=+3.183617793" lastFinishedPulling="2026-04-16 19:54:16.62181449 +0000 UTC m=+19.461619873" observedRunningTime="2026-04-16 19:54:17.964389326 +0000 UTC m=+20.804194725" watchObservedRunningTime="2026-04-16 19:54:17.964680436 +0000 UTC m=+20.804485837" Apr 16 19:54:18.175641 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.175619 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:18.705025 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.704905 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:18.175636282Z","UUID":"c784a4fd-661b-4d7b-b8a2-c270dac96312","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:18.708427 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.708401 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:18.708427 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.708432 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:18.804762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.804563 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:18.804762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.804573 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:18.804762 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:18.804693 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:18.804762 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:18.804752 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:18.932151 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.932113 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" event={"ID":"5cd7f671-aebd-4643-9419-5f6ae541d400","Type":"ContainerStarted","Data":"14c1a4f263978aa1a1549f7beab9dd4bd890fcedf54f02e6adf2fe6e2a5306b7"} Apr 16 19:54:18.932151 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:18.932155 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:54:19.805274 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:19.805031 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:19.805466 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:19.805321 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:19.936808 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:19.936771 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" event={"ID":"5cd7f671-aebd-4643-9419-5f6ae541d400","Type":"ContainerStarted","Data":"1784b530701651e2b5420a858dce5a059ad7ed9b5cc805eda4995c082234b501"} Apr 16 19:54:19.939943 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:19.939920 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 19:54:19.940296 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:19.940275 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"5feb55c4459007bdeb1c286d5efd78bf11f06061b12a65db4f50bf01d66dfcb2"} Apr 16 19:54:19.968324 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:19.968272 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8gd2h" podStartSLOduration=3.86034603 podStartE2EDuration="22.968241083s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.338103888 +0000 UTC m=+3.177909272" lastFinishedPulling="2026-04-16 19:54:19.445998946 +0000 UTC m=+22.285804325" observedRunningTime="2026-04-16 19:54:19.967442613 +0000 UTC m=+22.807248016" watchObservedRunningTime="2026-04-16 19:54:19.968241083 +0000 UTC m=+22.808046486" Apr 16 19:54:20.804264 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:20.804218 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:20.804539 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:20.804218 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:20.804539 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:20.804340 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:20.804539 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:20.804412 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:21.805261 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.805214 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:21.805718 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:21.805354 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78" Apr 16 19:54:21.947349 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.947314 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 19:54:21.947762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.947725 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"5ac9f930b520a0a709bd3da81afaa028720d2824e8a0519889c1fec628bed6c9"} Apr 16 19:54:21.949914 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.949739 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:54:21.949914 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.949816 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:54:21.950154 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.949936 2561 scope.go:117] "RemoveContainer" containerID="b1af1d73d53f91f044fc0521133f22132c677c4535e6785bccdac082fe74b2ad" Apr 16 19:54:21.968984 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:21.968748 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:54:22.804598 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.804410 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:22.804782 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.804449 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:22.804782 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:22.804684 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863" Apr 16 19:54:22.804782 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:22.804738 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5" Apr 16 19:54:22.950938 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.950909 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="80c5a130649c56a11a5da2afbc96972d8974d43d480e764b41c187001446b10a" exitCode=0 Apr 16 19:54:22.951659 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.950982 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"80c5a130649c56a11a5da2afbc96972d8974d43d480e764b41c187001446b10a"} Apr 16 19:54:22.954364 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.954343 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 19:54:22.954661 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.954641 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" event={"ID":"e71bfafb-1a55-495e-afc2-a4ffd47dedea","Type":"ContainerStarted","Data":"efd0ea915a99a7b737e44efb6769543c14c6fb9f60cfc3a8836fef7cb0217a5c"} Apr 16 19:54:22.954896 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.954883 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:54:22.969972 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:22.969951 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:54:23.014750 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.014701 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" podStartSLOduration=9.655231234 podStartE2EDuration="26.01468687s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.3339311 +0000 UTC m=+3.173736490" lastFinishedPulling="2026-04-16 19:54:16.69338674 +0000 UTC m=+19.533192126" observedRunningTime="2026-04-16 19:54:23.012786073 +0000 UTC m=+25.852591473" watchObservedRunningTime="2026-04-16 19:54:23.01468687 +0000 UTC m=+25.854492283" Apr 16 19:54:23.338786 
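The recurring "Generic (PLEG): container finished ... exitCode=0" / "ContainerDied" pairs for multus-additional-cni-plugins-w96qg trace its init containers completing one at a time: each CNI-plugin-install step must exit 0 before the next starts (da89102a... at 19:54:17, 80c5a130... at 19:54:22, then 32e5b3a6... and fadec8fe... below). A hypothetical helper for pulling that sequence out of a journal excerpt; the regexp and function are editorial tooling for log analysis, not anything the kubelet runs:

    package main

    import (
    	"fmt"
    	"regexp"
    )

    // diedRE matches PLEG "container finished" records and captures the
    // pod ID, container ID, and exit code.
    var diedRE = regexp.MustCompile(
    	`container finished" podID="([^"]+)" containerID="([0-9a-f]+)" exitCode=(\d+)`)

    func initChain(journal string) {
    	for _, m := range diedRE.FindAllStringSubmatch(journal, -1) {
    		fmt.Printf("pod %.8s: container %.12s exited %s\n", m[1], m[2], m[3])
    	}
    }

    func main() {
    	initChain(`... container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="80c5a130649c56a11a5da2afbc96972d8974d43d480e764b41c187001446b10a" exitCode=0 ...`)
    }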
Apr 16 19:54:23.338786 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.338749 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz"
Apr 16 19:54:23.338953 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.338884 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:23.338953 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.338947 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret podName:3ada28f2-9643-4885-b86d-53b74a05e6a5 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:39.338929071 +0000 UTC m=+42.178734452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret") pod "global-pull-secret-syncer-z89qz" (UID: "3ada28f2-9643-4885-b86d-53b74a05e6a5") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:23.804367 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.804337 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:23.804499 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.804448 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:54:23.902197 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.902151 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vx6n5"]
Apr 16 19:54:23.902954 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.902924 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z89qz"]
Apr 16 19:54:23.903103 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.903086 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz"
Apr 16 19:54:23.903224 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.903191 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5"
Apr 16 19:54:23.903873 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.903854 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qq6lx"]
Apr 16 19:54:23.903949 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.903939 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:54:23.904020 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.904004 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863"
Apr 16 19:54:23.956482 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:23.956455 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:23.956940 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:23.956576 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:54:24.960208 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:24.959979 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="32e5b3a63711f8ab440ccf3fd1076f4513efdeac06672eeb94ce9d218ecfdcc6" exitCode=0
Apr 16 19:54:24.960586 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:24.960057 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"32e5b3a63711f8ab440ccf3fd1076f4513efdeac06672eeb94ce9d218ecfdcc6"}
Apr 16 19:54:25.804938 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:25.804908 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:25.805075 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:25.805025 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:54:25.805075 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:25.805027 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:54:25.805075 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:25.805038 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz"
Apr 16 19:54:25.805188 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:25.805155 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863"
Apr 16 19:54:25.805293 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:25.805264 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5"
Apr 16 19:54:26.550932 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:26.550896 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:54:26.551387 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:26.551058 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 19:54:26.551552 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:26.551524 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-67lnf"
Apr 16 19:54:26.965536 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:26.965447 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="fadec8fe7036b95809fe1137ecf6eadae526d60671b6ca436fa3d73780dd569d" exitCode=0
Apr 16 19:54:26.965536 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:26.965524 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"fadec8fe7036b95809fe1137ecf6eadae526d60671b6ca436fa3d73780dd569d"}
Apr 16 19:54:27.806266 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:27.806213 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx"
Apr 16 19:54:27.806266 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:27.806272 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz"
Apr 16 19:54:27.807021 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:27.806232 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:27.807021 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:27.806336 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qq6lx" podUID="10a9f71c-fe60-4983-820b-1a8007ba1863"
Apr 16 19:54:27.807021 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:27.806462 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-z89qz" podUID="3ada28f2-9643-4885-b86d-53b74a05e6a5"
Apr 16 19:54:27.807021 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:27.806569 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vx6n5" podUID="5b25a2cc-2c1d-4c3d-93ff-f73223624d78"
Apr 16 19:54:29.512089 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.512059 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-116.ec2.internal" event="NodeReady"
Apr 16 19:54:29.512687 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.512201 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 19:54:29.593656 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.593621 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"]
Apr 16 19:54:29.617082 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.617045 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5jw6k"]
Apr 16 19:54:29.617238 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.617194 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.619916 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.619808 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 19:54:29.620682 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.620603 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 19:54:29.620821 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.620728 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 19:54:29.622114 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.622094 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lq4lt\""
Apr 16 19:54:29.626452 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.626419 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 19:54:29.632552 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.632532 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rfcgq"]
Apr 16 19:54:29.632702 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.632686 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jw6k"
Apr 16 19:54:29.636106 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.636086 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 19:54:29.636205 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.636193 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 19:54:29.636685 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.636668 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-668h7\""
Apr 16 19:54:29.656904 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.656880 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"]
Apr 16 19:54:29.656904 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.656906 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jw6k"]
Apr 16 19:54:29.657055 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.656915 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rfcgq"]
Apr 16 19:54:29.657055 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.657003 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rfcgq"
Apr 16 19:54:29.660155 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.660128 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 19:54:29.660155 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.660142 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 19:54:29.660329 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.660314 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 19:54:29.660849 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.660829 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfkbf\""
Apr 16 19:54:29.693506 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693466 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693655 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693519 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693655 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693551 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693655 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693607 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693772 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693681 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693772 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693711 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749z8\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693772 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693746 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.693889 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.693844 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795266 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795106 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795422 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795291 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795422 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795318 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795422 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795349 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k"
Apr 16 19:54:29.795422 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795403 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr8d4\" (UniqueName: \"kubernetes.io/projected/2a59db72-acc4-42f2-934a-6582522efbbc-kube-api-access-cr8d4\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq"
Apr 16 19:54:29.795601 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795435 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795601 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-749z8\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.795601 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795521 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795800 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795878 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-config-volume\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795922 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vc9\" (UniqueName: \"kubernetes.io/projected/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-kube-api-access-79vc9\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.795945 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795953 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.795965 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.795978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.796003 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-tmp-dir\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k"
Apr 16 19:54:29.796049 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.796034 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.296016211 +0000 UTC m=+33.135821589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found
Apr 16 19:54:29.796740 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.796147 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.796740 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.796509 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.796883 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.796863 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.799849 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.799822 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.800048 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.799833 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.805209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.804496 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5"
Apr 16 19:54:29.805209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.804633 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-749z8\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:54:29.805209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.804737 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:29.805209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.804847 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:29.805209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.805207 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:29.807733 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.807712 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:29.807875 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.807755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:29.807875 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.807758 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qj6k2\"" Apr 16 19:54:29.808075 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.808060 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:29.808149 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.808090 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:54:29.808322 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.808303 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\"" Apr 16 19:54:29.897627 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897532 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-config-volume\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.897627 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897587 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79vc9\" (UniqueName: \"kubernetes.io/projected/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-kube-api-access-79vc9\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897631 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-tmp-dir\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897667 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " 
pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897695 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr8d4\" (UniqueName: \"kubernetes.io/projected/2a59db72-acc4-42f2-934a-6582522efbbc-kube-api-access-cr8d4\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.897733 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.897796 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:29.897836 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.897819 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:29.898114 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.897877 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.397858243 +0000 UTC m=+33.237663635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:29.898114 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:29.897894 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:30.39788567 +0000 UTC m=+33.237691053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:29.898114 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.898086 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-tmp-dir\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.898281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.898164 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-config-volume\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.914019 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.913991 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vc9\" (UniqueName: \"kubernetes.io/projected/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-kube-api-access-79vc9\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:29.914230 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:29.914208 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr8d4\" (UniqueName: \"kubernetes.io/projected/2a59db72-acc4-42f2-934a-6582522efbbc-kube-api-access-cr8d4\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:30.301041 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:30.301007 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:30.301218 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.301164 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:30.301218 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.301180 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found Apr 16 19:54:30.301331 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.301275 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:31.301236725 +0000 UTC m=+34.141042120 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found Apr 16 19:54:30.402164 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:30.402114 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:30.402357 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:30.402188 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:30.402357 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.402319 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:30.402490 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.402359 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:30.402490 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.402413 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:31.402390378 +0000 UTC m=+34.242195766 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:30.402490 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:30.402434 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:31.402424463 +0000 UTC m=+34.242229845 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:31.309287 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.309230 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:31.309820 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.309380 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:31.309820 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.309399 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found Apr 16 19:54:31.309820 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.309463 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.309442326 +0000 UTC m=+36.149247715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found Apr 16 19:54:31.409858 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.409812 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:31.410062 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.409899 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:31.410062 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.409948 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:54:31.410062 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.409980 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:31.410062 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.410025 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:31.410062 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.410054 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.410034996 +0000 UTC m=+36.249840376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:31.410344 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.410083 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.410067173 +0000 UTC m=+36.249872552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:31.410344 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.410090 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 19:54:31.410344 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:31.410140 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs podName:5b25a2cc-2c1d-4c3d-93ff-f73223624d78 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:03.410125861 +0000 UTC m=+66.249931248 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs") pod "network-metrics-daemon-vx6n5" (UID: "5b25a2cc-2c1d-4c3d-93ff-f73223624d78") : secret "metrics-daemon-secret" not found Apr 16 19:54:31.510653 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.510615 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:31.513562 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.513535 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fbcj\" (UniqueName: \"kubernetes.io/projected/10a9f71c-fe60-4983-820b-1a8007ba1863-kube-api-access-5fbcj\") pod \"network-check-target-qq6lx\" (UID: \"10a9f71c-fe60-4983-820b-1a8007ba1863\") " pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:31.629013 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:31.628929 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:32.792958 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:32.792791 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qq6lx"] Apr 16 19:54:32.881988 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:32.881956 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10a9f71c_fe60_4983_820b_1a8007ba1863.slice/crio-39c321ef94c3736bd2758fc3ec89283b2691a239058895975911d2dc8f67de27 WatchSource:0}: Error finding container 39c321ef94c3736bd2758fc3ec89283b2691a239058895975911d2dc8f67de27: Status 404 returned error can't find the container with id 39c321ef94c3736bd2758fc3ec89283b2691a239058895975911d2dc8f67de27 Apr 16 19:54:32.980548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:32.980501 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qq6lx" event={"ID":"10a9f71c-fe60-4983-820b-1a8007ba1863","Type":"ContainerStarted","Data":"39c321ef94c3736bd2758fc3ec89283b2691a239058895975911d2dc8f67de27"} Apr 16 19:54:33.325259 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:33.325214 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:33.325408 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.325365 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:33.325408 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.325379 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found Apr 16 19:54:33.325504 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.325434 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:37.325417166 +0000 UTC m=+40.165222552 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found Apr 16 19:54:33.426226 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:33.426139 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:33.426401 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:33.426297 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:33.426401 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.426325 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:33.426401 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.426400 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:37.426379419 +0000 UTC m=+40.266184803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:33.426523 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.426428 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:33.427182 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:33.426860 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:37.426826753 +0000 UTC m=+40.266632148 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:33.986195 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:33.986155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"8f68b86a3c15f929a748d1d86a54044bd163f46c13a9e42f8667e924a75025a4"} Apr 16 19:54:33.986195 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:33.986074 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="8f68b86a3c15f929a748d1d86a54044bd163f46c13a9e42f8667e924a75025a4" exitCode=0 Apr 16 19:54:34.991111 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:34.991080 2561 generic.go:358] "Generic (PLEG): container finished" podID="d12d0814-9c34-4bb1-b975-721e0ecd4752" containerID="d3bcbd44a3362419531794ad5d26921228a9debb789902bd4625f4cc4c396c95" exitCode=0 Apr 16 19:54:34.991538 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:34.991142 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerDied","Data":"d3bcbd44a3362419531794ad5d26921228a9debb789902bd4625f4cc4c396c95"} Apr 16 19:54:35.994565 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:35.994258 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qq6lx" event={"ID":"10a9f71c-fe60-4983-820b-1a8007ba1863","Type":"ContainerStarted","Data":"b6950001ffcc5d3ad9f3f2c6108b114c01ba21bab02bfef7d6a3d243f6680144"} Apr 16 19:54:35.994565 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:35.994503 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:54:35.997070 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:35.997049 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w96qg" event={"ID":"d12d0814-9c34-4bb1-b975-721e0ecd4752","Type":"ContainerStarted","Data":"9e468673e5819c822947b139233b42d5e0a3c03e379fa590d5dd6ee9b9bcc002"} Apr 16 19:54:36.010600 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:36.010537 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qq6lx" podStartSLOduration=36.189416857 podStartE2EDuration="39.010523263s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:32.890315181 +0000 UTC m=+35.730120560" lastFinishedPulling="2026-04-16 19:54:35.711421578 +0000 UTC m=+38.551226966" observedRunningTime="2026-04-16 19:54:36.0098944 +0000 UTC m=+38.849699800" watchObservedRunningTime="2026-04-16 19:54:36.010523263 +0000 UTC m=+38.850328660" Apr 16 19:54:36.034099 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:36.034046 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w96qg" podStartSLOduration=6.458889613 podStartE2EDuration="39.03403405s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:54:00.336956207 +0000 UTC m=+3.176761587" lastFinishedPulling="2026-04-16 19:54:32.912100645 +0000 UTC m=+35.751906024" 
observedRunningTime="2026-04-16 19:54:36.031874615 +0000 UTC m=+38.871680015" watchObservedRunningTime="2026-04-16 19:54:36.03403405 +0000 UTC m=+38.873839451" Apr 16 19:54:37.356761 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:37.356709 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:37.357128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.356852 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:37.357128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.356870 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found Apr 16 19:54:37.357128 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.356924 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:45.35690864 +0000 UTC m=+48.196714022 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found Apr 16 19:54:37.457295 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:37.457243 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:37.457427 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:37.457310 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:37.457427 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.457387 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:37.457427 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.457417 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:37.457515 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.457447 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:45.457432364 +0000 UTC m=+48.297237749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:37.457515 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:37.457460 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:54:45.457454604 +0000 UTC m=+48.297259984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:38.135843 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.135804 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5"] Apr 16 19:54:38.138845 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.138822 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" Apr 16 19:54:38.142356 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.142332 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:38.142458 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.142357 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-plsk6\"" Apr 16 19:54:38.143322 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.143308 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 19:54:38.147415 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.147392 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5"] Apr 16 19:54:38.263483 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.263448 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wjp\" (UniqueName: \"kubernetes.io/projected/c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc-kube-api-access-m2wjp\") pod \"migrator-74bb7799d9-w52n5\" (UID: \"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" Apr 16 19:54:38.364114 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.364082 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wjp\" (UniqueName: \"kubernetes.io/projected/c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc-kube-api-access-m2wjp\") pod \"migrator-74bb7799d9-w52n5\" (UID: \"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" Apr 16 19:54:38.374093 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.374071 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wjp\" (UniqueName: \"kubernetes.io/projected/c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc-kube-api-access-m2wjp\") pod \"migrator-74bb7799d9-w52n5\" (UID: \"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" Apr 16 19:54:38.448771 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.448703 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" Apr 16 19:54:38.560802 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:38.560772 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5"] Apr 16 19:54:38.567684 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:38.565393 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0fa525c_6b9b_4e3d_98b7_fd18c7a4c4bc.slice/crio-12cab6c1d44dda349486c161a8396ef86d9e99b506ac4633a74d435ee79b5571 WatchSource:0}: Error finding container 12cab6c1d44dda349486c161a8396ef86d9e99b506ac4633a74d435ee79b5571: Status 404 returned error can't find the container with id 12cab6c1d44dda349486c161a8396ef86d9e99b506ac4633a74d435ee79b5571 Apr 16 19:54:39.003516 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.003483 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" event={"ID":"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc","Type":"ContainerStarted","Data":"12cab6c1d44dda349486c161a8396ef86d9e99b506ac4633a74d435ee79b5571"} Apr 16 19:54:39.372984 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.372943 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:39.375509 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.375477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3ada28f2-9643-4885-b86d-53b74a05e6a5-original-pull-secret\") pod \"global-pull-secret-syncer-z89qz\" (UID: \"3ada28f2-9643-4885-b86d-53b74a05e6a5\") " pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:39.383861 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.383838 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvdgg"] Apr 16 19:54:39.387760 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.387740 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.390583 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.390563 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 19:54:39.390668 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.390598 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 19:54:39.390668 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.390563 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 19:54:39.391772 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.391753 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-q6xdv\"" Apr 16 19:54:39.391877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.391793 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 19:54:39.396469 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.396448 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvdgg"] Apr 16 19:54:39.434396 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.434359 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z89qz" Apr 16 19:54:39.461420 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.461398 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xlktd_487cb337-b28d-43b2-8430-bacf43a71449/dns-node-resolver/0.log" Apr 16 19:54:39.574695 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.574655 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b02238b2-6784-467e-a19c-e6d1889c489e-signing-key\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.574695 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.574686 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b02238b2-6784-467e-a19c-e6d1889c489e-signing-cabundle\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.574887 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.574713 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc8sg\" (UniqueName: \"kubernetes.io/projected/b02238b2-6784-467e-a19c-e6d1889c489e-kube-api-access-pc8sg\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.656543 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.656518 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z89qz"] Apr 16 19:54:39.659702 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:39.659664 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ada28f2_9643_4885_b86d_53b74a05e6a5.slice/crio-b5848846b4d0d123b5308a96622d55fd63b86662f8ba142b58782dda0b43e5c2 WatchSource:0}: Error finding container b5848846b4d0d123b5308a96622d55fd63b86662f8ba142b58782dda0b43e5c2: Status 404 returned error can't find the container with id b5848846b4d0d123b5308a96622d55fd63b86662f8ba142b58782dda0b43e5c2 Apr 16 19:54:39.675928 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.675905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b02238b2-6784-467e-a19c-e6d1889c489e-signing-key\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.676014 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.675940 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b02238b2-6784-467e-a19c-e6d1889c489e-signing-cabundle\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.676014 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.675983 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc8sg\" (UniqueName: \"kubernetes.io/projected/b02238b2-6784-467e-a19c-e6d1889c489e-kube-api-access-pc8sg\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.676827 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.676804 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b02238b2-6784-467e-a19c-e6d1889c489e-signing-cabundle\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.678515 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.678494 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b02238b2-6784-467e-a19c-e6d1889c489e-signing-key\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.684820 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.684793 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc8sg\" (UniqueName: \"kubernetes.io/projected/b02238b2-6784-467e-a19c-e6d1889c489e-kube-api-access-pc8sg\") pod \"service-ca-865cb79987-nvdgg\" (UID: \"b02238b2-6784-467e-a19c-e6d1889c489e\") " pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.699010 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.698986 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-nvdgg" Apr 16 19:54:39.812892 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:39.812862 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-nvdgg"] Apr 16 19:54:39.815990 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:54:39.815958 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb02238b2_6784_467e_a19c_e6d1889c489e.slice/crio-4121bc112952e53875f55c557aa3136d47be829675aee8b1a0f1a99840c7b6dd WatchSource:0}: Error finding container 4121bc112952e53875f55c557aa3136d47be829675aee8b1a0f1a99840c7b6dd: Status 404 returned error can't find the container with id 4121bc112952e53875f55c557aa3136d47be829675aee8b1a0f1a99840c7b6dd Apr 16 19:54:40.007034 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.006950 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z89qz" event={"ID":"3ada28f2-9643-4885-b86d-53b74a05e6a5","Type":"ContainerStarted","Data":"b5848846b4d0d123b5308a96622d55fd63b86662f8ba142b58782dda0b43e5c2"} Apr 16 19:54:40.008880 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.008851 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" event={"ID":"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc","Type":"ContainerStarted","Data":"d6aac80917fd52aaba6f670f5d4a5a546d4d78b64b860b0aae1f2abb77e7fe25"} Apr 16 19:54:40.008880 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.008886 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" event={"ID":"c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc","Type":"ContainerStarted","Data":"f23b858a2318db56172c981ec7602b040ab834f278d888843e3b95935db1700d"} Apr 16 19:54:40.010197 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.010170 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nvdgg" event={"ID":"b02238b2-6784-467e-a19c-e6d1889c489e","Type":"ContainerStarted","Data":"4121bc112952e53875f55c557aa3136d47be829675aee8b1a0f1a99840c7b6dd"} Apr 16 19:54:40.028731 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.028685 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w52n5" podStartSLOduration=0.999916358 podStartE2EDuration="2.028669237s" podCreationTimestamp="2026-04-16 19:54:38 +0000 UTC" firstStartedPulling="2026-04-16 19:54:38.569061166 +0000 UTC m=+41.408866545" lastFinishedPulling="2026-04-16 19:54:39.59781403 +0000 UTC m=+42.437619424" observedRunningTime="2026-04-16 19:54:40.027758309 +0000 UTC m=+42.867563710" watchObservedRunningTime="2026-04-16 19:54:40.028669237 +0000 UTC m=+42.868474638" Apr 16 19:54:40.262665 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:40.262586 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kzf5d_f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55/node-ca/0.log" Apr 16 19:54:42.015904 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:42.015829 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-nvdgg" event={"ID":"b02238b2-6784-467e-a19c-e6d1889c489e","Type":"ContainerStarted","Data":"0e78ccf6c77ff6948b690bbd46cc74b8c638af032eccdea1c240d512089a2d57"} Apr 16 19:54:42.035238 ip-10-0-130-116 kubenswrapper[2561]: I0416 
19:54:42.035189 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-nvdgg" podStartSLOduration=1.118322095 podStartE2EDuration="3.035172557s" podCreationTimestamp="2026-04-16 19:54:39 +0000 UTC" firstStartedPulling="2026-04-16 19:54:39.817741474 +0000 UTC m=+42.657546853" lastFinishedPulling="2026-04-16 19:54:41.734591932 +0000 UTC m=+44.574397315" observedRunningTime="2026-04-16 19:54:42.033687406 +0000 UTC m=+44.873492831" watchObservedRunningTime="2026-04-16 19:54:42.035172557 +0000 UTC m=+44.874977970" Apr 16 19:54:44.022112 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:44.022076 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z89qz" event={"ID":"3ada28f2-9643-4885-b86d-53b74a05e6a5","Type":"ContainerStarted","Data":"7e72a65be3123ad2196f724b773f97b9cf952f0a2599da6d67e4386c42854dda"} Apr 16 19:54:44.043087 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:44.043035 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z89qz" podStartSLOduration=33.435805543 podStartE2EDuration="37.043015869s" podCreationTimestamp="2026-04-16 19:54:07 +0000 UTC" firstStartedPulling="2026-04-16 19:54:39.661327324 +0000 UTC m=+42.501132703" lastFinishedPulling="2026-04-16 19:54:43.268537635 +0000 UTC m=+46.108343029" observedRunningTime="2026-04-16 19:54:44.041334396 +0000 UTC m=+46.881139803" watchObservedRunningTime="2026-04-16 19:54:44.043015869 +0000 UTC m=+46.882821276" Apr 16 19:54:45.422735 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:45.422686 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:54:45.423210 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.422857 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:45.423210 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.422882 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6454ccf87c-rjzjd: secret "image-registry-tls" not found Apr 16 19:54:45.423210 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.422955 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls podName:2c7aae42-1d5a-4a39-8da1-c98f99b37fd1 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.422933478 +0000 UTC m=+64.262738857 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls") pod "image-registry-6454ccf87c-rjzjd" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1") : secret "image-registry-tls" not found Apr 16 19:54:45.523519 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:45.523473 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:54:45.523721 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:45.523591 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:54:45.523721 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.523616 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:45.523721 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.523676 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert podName:2a59db72-acc4-42f2-934a-6582522efbbc nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.523660939 +0000 UTC m=+64.363466318 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert") pod "ingress-canary-rfcgq" (UID: "2a59db72-acc4-42f2-934a-6582522efbbc") : secret "canary-serving-cert" not found Apr 16 19:54:45.523721 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.523701 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:45.523879 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:54:45.523749 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls podName:28d7c952-6b2a-4308-acb9-9864f2a7d6dc nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.523735283 +0000 UTC m=+64.363540662 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls") pod "dns-default-5jw6k" (UID: "28d7c952-6b2a-4308-acb9-9864f2a7d6dc") : secret "dns-default-metrics-tls" not found Apr 16 19:54:54.973988 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:54:54.973959 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2px7v" Apr 16 19:55:01.450510 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.450462 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:55:01.452845 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.452824 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"image-registry-6454ccf87c-rjzjd\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") " pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:55:01.551222 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.551180 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:55:01.551431 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.551233 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:55:01.553543 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.553513 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/28d7c952-6b2a-4308-acb9-9864f2a7d6dc-metrics-tls\") pod \"dns-default-5jw6k\" (UID: \"28d7c952-6b2a-4308-acb9-9864f2a7d6dc\") " pod="openshift-dns/dns-default-5jw6k" Apr 16 19:55:01.553756 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.553736 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a59db72-acc4-42f2-934a-6582522efbbc-cert\") pod \"ingress-canary-rfcgq\" (UID: \"2a59db72-acc4-42f2-934a-6582522efbbc\") " pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:55:01.732565 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.732489 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lq4lt\"" Apr 16 19:55:01.740212 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.740191 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:55:01.745817 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.745785 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-668h7\"" Apr 16 19:55:01.753443 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.753417 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5jw6k" Apr 16 19:55:01.772933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.772905 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xfkbf\"" Apr 16 19:55:01.778817 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.778735 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rfcgq" Apr 16 19:55:01.885900 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.885869 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"] Apr 16 19:55:01.896816 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:01.896780 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c7aae42_1d5a_4a39_8da1_c98f99b37fd1.slice/crio-a651b8bceecdd6471fe7d8e7e51501977da5c7cecc012bc814bb4eaa1efcb961 WatchSource:0}: Error finding container a651b8bceecdd6471fe7d8e7e51501977da5c7cecc012bc814bb4eaa1efcb961: Status 404 returned error can't find the container with id a651b8bceecdd6471fe7d8e7e51501977da5c7cecc012bc814bb4eaa1efcb961 Apr 16 19:55:01.903624 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.903599 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5jw6k"] Apr 16 19:55:01.906223 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:01.906201 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d7c952_6b2a_4308_acb9_9864f2a7d6dc.slice/crio-9a7b6337668f5c615d9394b36d95c512858704bd0775accf61858001908f6edb WatchSource:0}: Error finding container 9a7b6337668f5c615d9394b36d95c512858704bd0775accf61858001908f6edb: Status 404 returned error can't find the container with id 9a7b6337668f5c615d9394b36d95c512858704bd0775accf61858001908f6edb Apr 16 19:55:01.927463 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:01.927438 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rfcgq"] Apr 16 19:55:01.930497 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:01.930474 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a59db72_acc4_42f2_934a_6582522efbbc.slice/crio-be3c6be7413a5a630c58e6ed134122b23c0e926d3b09e4f95e833d36a03b72bd WatchSource:0}: Error finding container be3c6be7413a5a630c58e6ed134122b23c0e926d3b09e4f95e833d36a03b72bd: Status 404 returned error can't find the container with id be3c6be7413a5a630c58e6ed134122b23c0e926d3b09e4f95e833d36a03b72bd Apr 16 19:55:02.062273 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:02.062227 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rfcgq" event={"ID":"2a59db72-acc4-42f2-934a-6582522efbbc","Type":"ContainerStarted","Data":"be3c6be7413a5a630c58e6ed134122b23c0e926d3b09e4f95e833d36a03b72bd"} Apr 16 19:55:02.063357 ip-10-0-130-116 
kubenswrapper[2561]: I0416 19:55:02.063332 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jw6k" event={"ID":"28d7c952-6b2a-4308-acb9-9864f2a7d6dc","Type":"ContainerStarted","Data":"9a7b6337668f5c615d9394b36d95c512858704bd0775accf61858001908f6edb"} Apr 16 19:55:02.064583 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:02.064554 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" event={"ID":"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1","Type":"ContainerStarted","Data":"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f"} Apr 16 19:55:02.064583 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:02.064585 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" event={"ID":"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1","Type":"ContainerStarted","Data":"a651b8bceecdd6471fe7d8e7e51501977da5c7cecc012bc814bb4eaa1efcb961"} Apr 16 19:55:02.064762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:02.064684 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:55:02.091289 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:02.089357 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" podStartSLOduration=57.089339952 podStartE2EDuration="57.089339952s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:02.088341397 +0000 UTC m=+64.928146802" watchObservedRunningTime="2026-04-16 19:55:02.089339952 +0000 UTC m=+64.929145353" Apr 16 19:55:03.173245 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.173214 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"] Apr 16 19:55:03.354067 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.354029 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69647495b-szrmb"] Apr 16 19:55:03.357303 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.357275 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.374607 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.374550 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69647495b-szrmb"] Apr 16 19:55:03.375559 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.375521 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x64wb"] Apr 16 19:55:03.378735 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.378708 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.381821 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.381763 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pxnkg\"" Apr 16 19:55:03.381967 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.381943 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:55:03.382231 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.382215 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:55:03.382522 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.382490 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:55:03.385726 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.385707 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:55:03.393999 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.393980 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x64wb"] Apr 16 19:55:03.464933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464839 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-registry-tls\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.464933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464879 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43164e36-45f6-4070-9897-95da9982dd10-ca-trust-extracted\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.464933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464899 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.464933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464928 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af1db9-97e2-4754-951b-1299f4f7a507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464966 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-registry-certificates\") pod \"image-registry-69647495b-szrmb\" (UID: 
\"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.464993 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-installation-pull-secrets\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465022 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-trusted-ca\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465060 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqzqv\" (UniqueName: \"kubernetes.io/projected/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-api-access-sqzqv\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465081 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gknmn\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-kube-api-access-gknmn\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465099 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-image-registry-private-configuration\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465123 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-bound-sa-token\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.465214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465204 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af1db9-97e2-4754-951b-1299f4f7a507-crio-socket\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.465595 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465239 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/b2af1db9-97e2-4754-951b-1299f4f7a507-data-volume\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.465595 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.465328 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:55:03.467960 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.467932 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b25a2cc-2c1d-4c3d-93ff-f73223624d78-metrics-certs\") pod \"network-metrics-daemon-vx6n5\" (UID: \"5b25a2cc-2c1d-4c3d-93ff-f73223624d78\") " pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:55:03.566456 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566423 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-registry-tls\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566464 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43164e36-45f6-4070-9897-95da9982dd10-ca-trust-extracted\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566492 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.566623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566521 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af1db9-97e2-4754-951b-1299f4f7a507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.566623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566558 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-registry-certificates\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566623 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566584 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-installation-pull-secrets\") pod 
\"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566729 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-trusted-ca\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566768 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqzqv\" (UniqueName: \"kubernetes.io/projected/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-api-access-sqzqv\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.566872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566801 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gknmn\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-kube-api-access-gknmn\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566833 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-image-registry-private-configuration\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.566872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566861 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-bound-sa-token\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.567080 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af1db9-97e2-4754-951b-1299f4f7a507-crio-socket\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.567080 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.566909 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2af1db9-97e2-4754-951b-1299f4f7a507-data-volume\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.567080 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.567031 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " 
pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.567580 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.567294 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2af1db9-97e2-4754-951b-1299f4f7a507-data-volume\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.567580 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.567293 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2af1db9-97e2-4754-951b-1299f4f7a507-crio-socket\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.567580 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.567533 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/43164e36-45f6-4070-9897-95da9982dd10-ca-trust-extracted\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.568112 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.568088 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-registry-certificates\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.568229 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.568088 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/43164e36-45f6-4070-9897-95da9982dd10-trusted-ca\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.569360 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.569333 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2af1db9-97e2-4754-951b-1299f4f7a507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.569995 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.569977 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-image-registry-private-configuration\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.570116 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.570097 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/43164e36-45f6-4070-9897-95da9982dd10-installation-pull-secrets\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.570162 ip-10-0-130-116 kubenswrapper[2561]: 
I0416 19:55:03.570146 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-registry-tls\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.583181 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.583154 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqzqv\" (UniqueName: \"kubernetes.io/projected/b2af1db9-97e2-4754-951b-1299f4f7a507-kube-api-access-sqzqv\") pod \"insights-runtime-extractor-x64wb\" (UID: \"b2af1db9-97e2-4754-951b-1299f4f7a507\") " pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.591556 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.591535 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gknmn\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-kube-api-access-gknmn\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.592104 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.592077 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/43164e36-45f6-4070-9897-95da9982dd10-bound-sa-token\") pod \"image-registry-69647495b-szrmb\" (UID: \"43164e36-45f6-4070-9897-95da9982dd10\") " pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.668264 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.668220 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:03.689985 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.689958 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x64wb" Apr 16 19:55:03.724551 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.724472 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nnp77\"" Apr 16 19:55:03.733019 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.732237 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vx6n5" Apr 16 19:55:03.961917 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.961885 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69647495b-szrmb"] Apr 16 19:55:03.965542 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:03.965512 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43164e36_45f6_4070_9897_95da9982dd10.slice/crio-b0e3a756940f27d6c7b4402bcd15dbf97314f66e5529ddb8c449815c6bb58931 WatchSource:0}: Error finding container b0e3a756940f27d6c7b4402bcd15dbf97314f66e5529ddb8c449815c6bb58931: Status 404 returned error can't find the container with id b0e3a756940f27d6c7b4402bcd15dbf97314f66e5529ddb8c449815c6bb58931 Apr 16 19:55:03.966685 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.966657 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x64wb"] Apr 16 19:55:03.968500 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:03.968477 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2af1db9_97e2_4754_951b_1299f4f7a507.slice/crio-27a0b89c09c42fd22ad4e1925804362e00857e62e2ae4caedc65fc2b851526d0 WatchSource:0}: Error finding container 27a0b89c09c42fd22ad4e1925804362e00857e62e2ae4caedc65fc2b851526d0: Status 404 returned error can't find the container with id 27a0b89c09c42fd22ad4e1925804362e00857e62e2ae4caedc65fc2b851526d0 Apr 16 19:55:03.975827 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:03.975768 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vx6n5"] Apr 16 19:55:03.980457 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:03.980435 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b25a2cc_2c1d_4c3d_93ff_f73223624d78.slice/crio-329520f3d913ca6e657bb1c43791163cf5f8e511333f9de27498bda96838d883 WatchSource:0}: Error finding container 329520f3d913ca6e657bb1c43791163cf5f8e511333f9de27498bda96838d883: Status 404 returned error can't find the container with id 329520f3d913ca6e657bb1c43791163cf5f8e511333f9de27498bda96838d883 Apr 16 19:55:04.076419 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.076373 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vx6n5" event={"ID":"5b25a2cc-2c1d-4c3d-93ff-f73223624d78","Type":"ContainerStarted","Data":"329520f3d913ca6e657bb1c43791163cf5f8e511333f9de27498bda96838d883"} Apr 16 19:55:04.077811 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.077781 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x64wb" event={"ID":"b2af1db9-97e2-4754-951b-1299f4f7a507","Type":"ContainerStarted","Data":"be99e63ca16c7f77dfae7d3aa9bc3a8332e6dc74ed89a67852a4ade40e49d91b"} Apr 16 19:55:04.077930 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.077818 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x64wb" event={"ID":"b2af1db9-97e2-4754-951b-1299f4f7a507","Type":"ContainerStarted","Data":"27a0b89c09c42fd22ad4e1925804362e00857e62e2ae4caedc65fc2b851526d0"} Apr 16 19:55:04.079217 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.079187 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-rfcgq" event={"ID":"2a59db72-acc4-42f2-934a-6582522efbbc","Type":"ContainerStarted","Data":"4ec277e321fd5035a8be8b4234e477b715a598ab18d978d8d8be3c4057fa9c8b"} Apr 16 19:55:04.080841 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.080808 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jw6k" event={"ID":"28d7c952-6b2a-4308-acb9-9864f2a7d6dc","Type":"ContainerStarted","Data":"ed7feff34e13cd593ec797b26b9ceb0ce91fa2e3260eae54ca335440953b5b7e"} Apr 16 19:55:04.080841 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.080837 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5jw6k" event={"ID":"28d7c952-6b2a-4308-acb9-9864f2a7d6dc","Type":"ContainerStarted","Data":"e30bb86d7a7446d5a002914361440cd456ff0dd44d71c7317ddbf40589c6fe83"} Apr 16 19:55:04.080990 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.080947 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5jw6k" Apr 16 19:55:04.082208 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.082184 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69647495b-szrmb" event={"ID":"43164e36-45f6-4070-9897-95da9982dd10","Type":"ContainerStarted","Data":"a9eab2d9cacbbb791e27bc717bbc0517b9f972edfef8cd309e7c98c30f22735a"} Apr 16 19:55:04.082314 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.082216 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69647495b-szrmb" event={"ID":"43164e36-45f6-4070-9897-95da9982dd10","Type":"ContainerStarted","Data":"b0e3a756940f27d6c7b4402bcd15dbf97314f66e5529ddb8c449815c6bb58931"} Apr 16 19:55:04.082314 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.082284 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69647495b-szrmb" Apr 16 19:55:04.097430 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.097344 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rfcgq" podStartSLOduration=33.409586145 podStartE2EDuration="35.097327637s" podCreationTimestamp="2026-04-16 19:54:29 +0000 UTC" firstStartedPulling="2026-04-16 19:55:01.932220206 +0000 UTC m=+64.772025585" lastFinishedPulling="2026-04-16 19:55:03.619961695 +0000 UTC m=+66.459767077" observedRunningTime="2026-04-16 19:55:04.096828091 +0000 UTC m=+66.936633489" watchObservedRunningTime="2026-04-16 19:55:04.097327637 +0000 UTC m=+66.937133040" Apr 16 19:55:04.122607 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.122553 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5jw6k" podStartSLOduration=33.413373334 podStartE2EDuration="35.122537711s" podCreationTimestamp="2026-04-16 19:54:29 +0000 UTC" firstStartedPulling="2026-04-16 19:55:01.907929445 +0000 UTC m=+64.747734825" lastFinishedPulling="2026-04-16 19:55:03.617093812 +0000 UTC m=+66.456899202" observedRunningTime="2026-04-16 19:55:04.122371053 +0000 UTC m=+66.962176456" watchObservedRunningTime="2026-04-16 19:55:04.122537711 +0000 UTC m=+66.962343094" Apr 16 19:55:04.145075 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:04.145015 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69647495b-szrmb" podStartSLOduration=1.144993126 podStartE2EDuration="1.144993126s" 
podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:04.143349276 +0000 UTC m=+66.983154676" watchObservedRunningTime="2026-04-16 19:55:04.144993126 +0000 UTC m=+66.984798528" Apr 16 19:55:05.089209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:05.089169 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vx6n5" event={"ID":"5b25a2cc-2c1d-4c3d-93ff-f73223624d78","Type":"ContainerStarted","Data":"6193dfb12bc20003f2d510ac4554ccbc8180f7d61870a7206f9f0595ea1f0617"} Apr 16 19:55:05.091670 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:05.091623 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x64wb" event={"ID":"b2af1db9-97e2-4754-951b-1299f4f7a507","Type":"ContainerStarted","Data":"aa8bf3bf4b02c938ffa50a1a2a75ca3166918e853fc0d27559553751643c04cf"} Apr 16 19:55:06.096670 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.096626 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vx6n5" event={"ID":"5b25a2cc-2c1d-4c3d-93ff-f73223624d78","Type":"ContainerStarted","Data":"4b55dcc2ddd0701f83c90837f963695e5300d1d37b14e830d54d9566e345f2b6"} Apr 16 19:55:06.118073 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.118011 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vx6n5" podStartSLOduration=68.197633464 podStartE2EDuration="1m9.1179901s" podCreationTimestamp="2026-04-16 19:53:57 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.982234886 +0000 UTC m=+66.822040265" lastFinishedPulling="2026-04-16 19:55:04.902591512 +0000 UTC m=+67.742396901" observedRunningTime="2026-04-16 19:55:06.117134315 +0000 UTC m=+68.956939749" watchObservedRunningTime="2026-04-16 19:55:06.1179901 +0000 UTC m=+68.957795502" Apr 16 19:55:06.129833 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.129803 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn"] Apr 16 19:55:06.134389 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.134367 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:06.137530 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.137385 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 19:55:06.137530 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.137500 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-gzglx\"" Apr 16 19:55:06.146598 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.146360 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn"] Apr 16 19:55:06.195452 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.195408 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qjmfn\" (UID: \"f95ab465-dbfa-421d-a505-279bdea9be8c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:06.296641 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.296609 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qjmfn\" (UID: \"f95ab465-dbfa-421d-a505-279bdea9be8c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:06.296752 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:55:06.296716 2561 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 19:55:06.296796 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:55:06.296766 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates podName:f95ab465-dbfa-421d-a505-279bdea9be8c nodeName:}" failed. No retries permitted until 2026-04-16 19:55:06.796750313 +0000 UTC m=+69.636555693 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-qjmfn" (UID: "f95ab465-dbfa-421d-a505-279bdea9be8c") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 19:55:06.800394 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.800361 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qjmfn\" (UID: \"f95ab465-dbfa-421d-a505-279bdea9be8c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:06.802645 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:06.802615 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f95ab465-dbfa-421d-a505-279bdea9be8c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qjmfn\" (UID: \"f95ab465-dbfa-421d-a505-279bdea9be8c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:07.001176 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:07.001148 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qq6lx" Apr 16 19:55:07.048099 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:07.048070 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:07.104214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:07.104180 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x64wb" event={"ID":"b2af1db9-97e2-4754-951b-1299f4f7a507","Type":"ContainerStarted","Data":"667fd536435835069c7adab951656fc6c8532eb66e67f0769e9256823663833f"} Apr 16 19:55:07.130322 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:07.129855 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x64wb" podStartSLOduration=1.900914038 podStartE2EDuration="4.129835543s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:04.037131856 +0000 UTC m=+66.876937238" lastFinishedPulling="2026-04-16 19:55:06.266053361 +0000 UTC m=+69.105858743" observedRunningTime="2026-04-16 19:55:07.128591964 +0000 UTC m=+69.968397378" watchObservedRunningTime="2026-04-16 19:55:07.129835543 +0000 UTC m=+69.969640944" Apr 16 19:55:07.173129 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:07.173099 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn"] Apr 16 19:55:07.176456 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:07.176417 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95ab465_dbfa_421d_a505_279bdea9be8c.slice/crio-b5483323549d46a8faffd38eba1df32c65443b9a3d98d2b55bede33eefa8faf8 WatchSource:0}: Error finding container b5483323549d46a8faffd38eba1df32c65443b9a3d98d2b55bede33eefa8faf8: Status 404 returned error can't find the container with id b5483323549d46a8faffd38eba1df32c65443b9a3d98d2b55bede33eefa8faf8 Apr 16 19:55:08.108578 ip-10-0-130-116 kubenswrapper[2561]: 
I0416 19:55:08.108535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" event={"ID":"f95ab465-dbfa-421d-a505-279bdea9be8c","Type":"ContainerStarted","Data":"b5483323549d46a8faffd38eba1df32c65443b9a3d98d2b55bede33eefa8faf8"} Apr 16 19:55:09.111905 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:09.111870 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" event={"ID":"f95ab465-dbfa-421d-a505-279bdea9be8c","Type":"ContainerStarted","Data":"98827b113bfae50768d34c3121a301edfd72c8a1d80e632709147ad2cdb6b2f4"} Apr 16 19:55:09.112369 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:09.112134 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:09.116710 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:09.116690 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" Apr 16 19:55:09.136657 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:09.136605 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qjmfn" podStartSLOduration=2.117589278 podStartE2EDuration="3.136593065s" podCreationTimestamp="2026-04-16 19:55:06 +0000 UTC" firstStartedPulling="2026-04-16 19:55:07.178364441 +0000 UTC m=+70.018169821" lastFinishedPulling="2026-04-16 19:55:08.197368217 +0000 UTC m=+71.037173608" observedRunningTime="2026-04-16 19:55:09.13623339 +0000 UTC m=+71.976038792" watchObservedRunningTime="2026-04-16 19:55:09.136593065 +0000 UTC m=+71.976398465" Apr 16 19:55:10.208115 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.208077 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hb5qb"] Apr 16 19:55:10.211023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.211006 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.213778 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.213759 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 19:55:10.215070 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.215053 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 19:55:10.215150 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.215093 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:55:10.215331 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.215315 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:55:10.215415 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.215363 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-z7njj\"" Apr 16 19:55:10.215470 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.215415 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:55:10.220525 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.220506 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hb5qb"] Apr 16 19:55:10.325132 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.325094 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.325360 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.325149 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9z7\" (UniqueName: \"kubernetes.io/projected/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-kube-api-access-sp9z7\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.325360 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.325170 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.325360 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.325267 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.426214 ip-10-0-130-116 
kubenswrapper[2561]: I0416 19:55:10.426181 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9z7\" (UniqueName: \"kubernetes.io/projected/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-kube-api-access-sp9z7\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.426214 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.426222 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.426439 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.426276 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.426439 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.426308 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.426439 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:55:10.426428 2561 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 19:55:10.426555 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:55:10.426501 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls podName:5b9eafb4-5fb9-437a-8757-c4b78d699f1f nodeName:}" failed. No retries permitted until 2026-04-16 19:55:10.926481703 +0000 UTC m=+73.766287096 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-hb5qb" (UID: "5b9eafb4-5fb9-437a-8757-c4b78d699f1f") : secret "prometheus-operator-tls" not found Apr 16 19:55:10.427022 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.427001 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.428691 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.428673 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.436797 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.436778 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9z7\" (UniqueName: \"kubernetes.io/projected/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-kube-api-access-sp9z7\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.930920 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.930874 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:10.933258 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:10.933230 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b9eafb4-5fb9-437a-8757-c4b78d699f1f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hb5qb\" (UID: \"5b9eafb4-5fb9-437a-8757-c4b78d699f1f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:11.121355 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:11.121329 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" Apr 16 19:55:11.260049 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:11.259947 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hb5qb"] Apr 16 19:55:11.263371 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:11.263333 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9eafb4_5fb9_437a_8757_c4b78d699f1f.slice/crio-02af1b7aa443cf250018cb162da79b9cc093c5a7e6c194cbea5ad19e323625a2 WatchSource:0}: Error finding container 02af1b7aa443cf250018cb162da79b9cc093c5a7e6c194cbea5ad19e323625a2: Status 404 returned error can't find the container with id 02af1b7aa443cf250018cb162da79b9cc093c5a7e6c194cbea5ad19e323625a2 Apr 16 19:55:12.121591 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:12.121548 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" event={"ID":"5b9eafb4-5fb9-437a-8757-c4b78d699f1f","Type":"ContainerStarted","Data":"02af1b7aa443cf250018cb162da79b9cc093c5a7e6c194cbea5ad19e323625a2"} Apr 16 19:55:13.126454 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:13.126420 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" event={"ID":"5b9eafb4-5fb9-437a-8757-c4b78d699f1f","Type":"ContainerStarted","Data":"d2739f0979222b4a995f09bffbdfd719ab6e717e411cd6e8b0a27aede188b23b"} Apr 16 19:55:13.126454 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:13.126457 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" event={"ID":"5b9eafb4-5fb9-437a-8757-c4b78d699f1f","Type":"ContainerStarted","Data":"2135dfabe6df5058cb78d4c135ad0cc21ccf16113a767fb7d2cd33e43423bbd2"} Apr 16 19:55:13.148463 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:13.148406 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-hb5qb" podStartSLOduration=2.053807603 podStartE2EDuration="3.148391895s" podCreationTimestamp="2026-04-16 19:55:10 +0000 UTC" firstStartedPulling="2026-04-16 19:55:11.265303433 +0000 UTC m=+74.105108812" lastFinishedPulling="2026-04-16 19:55:12.359887723 +0000 UTC m=+75.199693104" observedRunningTime="2026-04-16 19:55:13.147811162 +0000 UTC m=+75.987616563" watchObservedRunningTime="2026-04-16 19:55:13.148391895 +0000 UTC m=+75.988197340" Apr 16 19:55:14.094429 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.094393 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5jw6k" Apr 16 19:55:14.636572 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.636538 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v9qhq"] Apr 16 19:55:14.640227 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.640200 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.643375 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.643127 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8qnsr\"" Apr 16 19:55:14.643375 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.643178 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:55:14.643375 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.643226 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:55:14.643375 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.643244 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:55:14.662891 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.662861 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xjz7t"] Apr 16 19:55:14.665993 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.665974 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.671324 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.671299 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 19:55:14.672681 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.672662 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:55:14.672973 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.672956 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-j87gv\"" Apr 16 19:55:14.673737 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.673720 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 19:55:14.681432 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.681409 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xjz7t"] Apr 16 19:55:14.763491 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763457 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-wtmp\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763491 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763495 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-sys\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763689 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763521 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/c5f736cf-45d9-4458-9bc5-c8de66533a6b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.763689 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763577 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.763689 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763623 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-tls\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763689 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763643 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-metrics-client-ca\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763689 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763672 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfv6w\" (UniqueName: \"kubernetes.io/projected/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-api-access-bfv6w\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763704 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-textfile\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763761 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763809 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-root\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763857 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-l6t4n\" (UniqueName: \"kubernetes.io/projected/757fd1bb-7b91-41e1-9d8e-881aa2286adc-kube-api-access-l6t4n\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763887 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.763965 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763917 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.764179 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.763967 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.764179 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.764001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.864691 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864655 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-root\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864703 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6t4n\" (UniqueName: \"kubernetes.io/projected/757fd1bb-7b91-41e1-9d8e-881aa2286adc-kube-api-access-l6t4n\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864727 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864746 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864773 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-root\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864791 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864836 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.864879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864873 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-wtmp\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-sys\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864946 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c5f736cf-45d9-4458-9bc5-c8de66533a6b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.864973 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865011 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-tls\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865039 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-metrics-client-ca\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfv6w\" (UniqueName: \"kubernetes.io/projected/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-api-access-bfv6w\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865121 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-textfile\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865237 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865151 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865646 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865582 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865709 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865653 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-sys\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865764 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865704 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865764 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865733 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-custom-resource-state-configmap\") 
pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.865860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865763 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-wtmp\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.865860 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.865785 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/757fd1bb-7b91-41e1-9d8e-881aa2286adc-metrics-client-ca\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.866053 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.866027 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-textfile\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.866200 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.866181 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c5f736cf-45d9-4458-9bc5-c8de66533a6b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.867567 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.867542 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.867939 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.867920 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" Apr 16 19:55:14.868041 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.868024 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/757fd1bb-7b91-41e1-9d8e-881aa2286adc-node-exporter-tls\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq" Apr 16 19:55:14.868300 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.868281 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" 
Apr 16 19:55:14.882031 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.882007 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6t4n\" (UniqueName: \"kubernetes.io/projected/757fd1bb-7b91-41e1-9d8e-881aa2286adc-kube-api-access-l6t4n\") pod \"node-exporter-v9qhq\" (UID: \"757fd1bb-7b91-41e1-9d8e-881aa2286adc\") " pod="openshift-monitoring/node-exporter-v9qhq"
Apr 16 19:55:14.884489 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.884464 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfv6w\" (UniqueName: \"kubernetes.io/projected/c5f736cf-45d9-4458-9bc5-c8de66533a6b-kube-api-access-bfv6w\") pod \"kube-state-metrics-69db897b98-xjz7t\" (UID: \"c5f736cf-45d9-4458-9bc5-c8de66533a6b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t"
Apr 16 19:55:14.952029 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.951948 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v9qhq"
Apr 16 19:55:14.962054 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:14.962022 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757fd1bb_7b91_41e1_9d8e_881aa2286adc.slice/crio-da25e04ac98e1254a1fc16dc880704402864550152fc4ea04c5d5228e8ee27a2 WatchSource:0}: Error finding container da25e04ac98e1254a1fc16dc880704402864550152fc4ea04c5d5228e8ee27a2: Status 404 returned error can't find the container with id da25e04ac98e1254a1fc16dc880704402864550152fc4ea04c5d5228e8ee27a2
Apr 16 19:55:14.975065 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:14.975033 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t"
Apr 16 19:55:15.117032 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:15.116994 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xjz7t"]
Apr 16 19:55:15.120656 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:15.120617 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f736cf_45d9_4458_9bc5_c8de66533a6b.slice/crio-588cf9ab563cb9cb4692341286bfb437f38e20f34f2f67529a3aa41bb02cd664 WatchSource:0}: Error finding container 588cf9ab563cb9cb4692341286bfb437f38e20f34f2f67529a3aa41bb02cd664: Status 404 returned error can't find the container with id 588cf9ab563cb9cb4692341286bfb437f38e20f34f2f67529a3aa41bb02cd664
Apr 16 19:55:15.132417 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:15.132390 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9qhq" event={"ID":"757fd1bb-7b91-41e1-9d8e-881aa2286adc","Type":"ContainerStarted","Data":"da25e04ac98e1254a1fc16dc880704402864550152fc4ea04c5d5228e8ee27a2"}
Apr 16 19:55:15.133503 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:15.133434 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" event={"ID":"c5f736cf-45d9-4458-9bc5-c8de66533a6b","Type":"ContainerStarted","Data":"588cf9ab563cb9cb4692341286bfb437f38e20f34f2f67529a3aa41bb02cd664"}
Apr 16 19:55:17.142364 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:17.142336 2561 generic.go:358] "Generic (PLEG): container finished" podID="757fd1bb-7b91-41e1-9d8e-881aa2286adc" containerID="bf0fc1379d4726a7673ff8fcd221921480706934279aad5dbafb0504aa3c448e" exitCode=0
Apr 16 19:55:17.142733 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:17.142419 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9qhq" event={"ID":"757fd1bb-7b91-41e1-9d8e-881aa2286adc","Type":"ContainerDied","Data":"bf0fc1379d4726a7673ff8fcd221921480706934279aad5dbafb0504aa3c448e"}
Apr 16 19:55:17.144068 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:17.144048 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" event={"ID":"c5f736cf-45d9-4458-9bc5-c8de66533a6b","Type":"ContainerStarted","Data":"0e27add170ec0a6f6dd84a92e67eee9f2d417a0738c56c26640771205c2bc93c"}
Apr 16 19:55:17.144162 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:17.144075 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" event={"ID":"c5f736cf-45d9-4458-9bc5-c8de66533a6b","Type":"ContainerStarted","Data":"bcdf5ed7d5914b26074fda19dd73700178ee61916e065cba92d02d7498d01179"}
Apr 16 19:55:18.148895 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:18.148852 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9qhq" event={"ID":"757fd1bb-7b91-41e1-9d8e-881aa2286adc","Type":"ContainerStarted","Data":"a1845632ccfb915c609d4b7364e80ea8565bc561ac92825d3f5832aae4ff2dd0"}
Apr 16 19:55:18.148895 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:18.148897 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9qhq" event={"ID":"757fd1bb-7b91-41e1-9d8e-881aa2286adc","Type":"ContainerStarted","Data":"78cdfbf324082542d4866e2b65cf9ce50cfaaf4359476a3a100d417cfb5eed59"}
Apr 16 19:55:18.150659 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:18.150627 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" event={"ID":"c5f736cf-45d9-4458-9bc5-c8de66533a6b","Type":"ContainerStarted","Data":"fa136854b3ae51fde3061c30152bad5e09cc2c23de35b493c8fab86af5c00093"}
Apr 16 19:55:18.171832 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:18.171792 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v9qhq" podStartSLOduration=2.578188925 podStartE2EDuration="4.171780528s" podCreationTimestamp="2026-04-16 19:55:14 +0000 UTC" firstStartedPulling="2026-04-16 19:55:14.964265544 +0000 UTC m=+77.804070930" lastFinishedPulling="2026-04-16 19:55:16.557857149 +0000 UTC m=+79.397662533" observedRunningTime="2026-04-16 19:55:18.170562486 +0000 UTC m=+81.010367903" watchObservedRunningTime="2026-04-16 19:55:18.171780528 +0000 UTC m=+81.011585929"
Apr 16 19:55:18.191337 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:18.191281 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xjz7t" podStartSLOduration=2.480565169 podStartE2EDuration="4.191240702s" podCreationTimestamp="2026-04-16 19:55:14 +0000 UTC" firstStartedPulling="2026-04-16 19:55:15.122824969 +0000 UTC m=+77.962630347" lastFinishedPulling="2026-04-16 19:55:16.833500501 +0000 UTC m=+79.673305880" observedRunningTime="2026-04-16 19:55:18.19055253 +0000 UTC m=+81.030357927" watchObservedRunningTime="2026-04-16 19:55:18.191240702 +0000 UTC m=+81.031046104"
Apr 16 19:55:19.356405 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.356369 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"]
Apr 16 19:55:19.359615 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.359598 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:19.363898 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.363864 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fms4m\""
Apr 16 19:55:19.364033 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.363972 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 19:55:19.369094 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.369067 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"]
Apr 16 19:55:19.502124 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.502092 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/943b61e6-248b-4330-9e80-81ac4a4dc9cd-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fpvmn\" (UID: \"943b61e6-248b-4330-9e80-81ac4a4dc9cd\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:19.603299 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.603264 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/943b61e6-248b-4330-9e80-81ac4a4dc9cd-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fpvmn\" (UID: \"943b61e6-248b-4330-9e80-81ac4a4dc9cd\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:19.605625 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.605605 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/943b61e6-248b-4330-9e80-81ac4a4dc9cd-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fpvmn\" (UID: \"943b61e6-248b-4330-9e80-81ac4a4dc9cd\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:19.670923 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.670828 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:19.787325 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:19.787295 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"]
Apr 16 19:55:19.791218 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:19.791188 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943b61e6_248b_4330_9e80_81ac4a4dc9cd.slice/crio-076ecc7166536174efb7d924ea1a2b4ffe4ffa928d0289587cd13db3886c8e25 WatchSource:0}: Error finding container 076ecc7166536174efb7d924ea1a2b4ffe4ffa928d0289587cd13db3886c8e25: Status 404 returned error can't find the container with id 076ecc7166536174efb7d924ea1a2b4ffe4ffa928d0289587cd13db3886c8e25
Apr 16 19:55:20.157077 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.157043 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn" event={"ID":"943b61e6-248b-4330-9e80-81ac4a4dc9cd","Type":"ContainerStarted","Data":"076ecc7166536174efb7d924ea1a2b4ffe4ffa928d0289587cd13db3886c8e25"}
Apr 16 19:55:20.898706 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.898113 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:55:20.902544 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.902513 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:20.906072 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.906030 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:55:20.906072 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.906052 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:55:20.908760 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.908582 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:55:20.908760 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.908700 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:55:20.908926 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.908815 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:55:20.909381 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.909360 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:55:20.909732 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.909646 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:55:20.909841 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.909734 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e5gcma5esfmoi\""
Apr 16 19:55:20.910125 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910107 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:55:20.910125 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910122 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:55:20.910292 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910170 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:55:20.910628 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910608 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:55:20.910721 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910684 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:55:20.910721 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.910684 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-d6qch\""
Apr 16 19:55:20.913618 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.913146 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:55:20.920380 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:20.919569 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:55:21.014836 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.014789 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.014843 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.014882 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.014910 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.014972 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015023 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015008 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015039 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015094 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015137 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015180 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015214 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015269 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015312 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015349 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6lnw\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015381 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015445 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015710 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015499 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.015710 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.015530 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118053 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118023 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118162 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118065 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118162 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118102 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118162 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118355 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118301 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118355 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118340 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118455 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118412 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118455 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118445 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118558 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118479 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118558 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118505 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118558 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118535 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118708 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118586 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118708 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118616 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118708 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118640 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118708 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118708 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118694 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118954 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118734 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.118954 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118778 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6lnw\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.119067 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.118975 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.119850 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.119817 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.120357 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.120330 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.122167 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.122136 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.123755 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.123682 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.123755 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.123711 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.126034 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.126007 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.126445 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.126711 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.126835 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.126876 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.127028 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.127238 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.129320 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.127786 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.131843 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.129973 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6lnw\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.135095 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.134517 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.135095 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.134769 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.135095 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.134772 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.219876 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.219792 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.361933 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:21.361888 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:55:21.364955 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:55:21.364924 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68fa57d_b8c4_40fd_af10_efcd03b1720b.slice/crio-089be0f16ebeafb362596215cd0b807a3cd1add3045fae0ee51e0b768e121fcc WatchSource:0}: Error finding container 089be0f16ebeafb362596215cd0b807a3cd1add3045fae0ee51e0b768e121fcc: Status 404 returned error can't find the container with id 089be0f16ebeafb362596215cd0b807a3cd1add3045fae0ee51e0b768e121fcc
Apr 16 19:55:22.165649 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:22.165614 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn" event={"ID":"943b61e6-248b-4330-9e80-81ac4a4dc9cd","Type":"ContainerStarted","Data":"758bec61a78d229a9e9bea3cfefd6b359835958bb04fda828fcabc9378ccd7bf"}
Apr 16 19:55:22.166226 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:22.165845 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:22.167037 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:22.166999 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"089be0f16ebeafb362596215cd0b807a3cd1add3045fae0ee51e0b768e121fcc"}
Apr 16 19:55:22.172114 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:22.172084 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn"
Apr 16 19:55:22.184222 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:22.184171 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fpvmn" podStartSLOduration=1.8710784280000001 podStartE2EDuration="3.184156186s" podCreationTimestamp="2026-04-16 19:55:19 +0000 UTC" firstStartedPulling="2026-04-16 19:55:19.793332369 +0000 UTC m=+82.633137748" lastFinishedPulling="2026-04-16 19:55:21.106410127 +0000 UTC m=+83.946215506" observedRunningTime="2026-04-16 19:55:22.182555179 +0000 UTC m=+85.022360584" watchObservedRunningTime="2026-04-16 19:55:22.184156186 +0000 UTC m=+85.023961587"
Apr 16 19:55:23.170703 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:23.170664 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="ec8ee76d5209978c532b99d11876374eb889cdd4180d576efefbef242d33c458" exitCode=0
Apr 16 19:55:23.171088 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:23.170748 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"ec8ee76d5209978c532b99d11876374eb889cdd4180d576efefbef242d33c458"}
Apr 16 19:55:24.090062 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:24.090032 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:55:25.096289 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:25.096233 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69647495b-szrmb"
Apr 16 19:55:26.182099 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:26.182062 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"a3359dfc38e226836220a04532848ad40c44315b706aea74e89fb77781b37085"}
Apr 16 19:55:26.182527 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:26.182102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"8e37cd21539a66ebb2fb9fde14484532fef4192f4ea135b8cfce6f289c6792cb"}
Apr 16 19:55:28.191903 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:28.191864 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"e4231a4ba1cbe69cd584a5847bacf536c97d3cb62d9cbc7f524b553247721199"}
Apr 16 19:55:28.191903 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:28.191900 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"cd6cddede3db107a6862fb048867838f3b7a6d8345797bb5f876184a6e884375"}
Apr 16 19:55:28.191903 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:28.191910 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"d09eeadbebe65489aa7f47ddfc00f427d8eacba64f884a2eaecd25b2db56efca"}
Apr 16 19:55:28.192381 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:28.191919 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerStarted","Data":"9329be1e586537d4eb80c7e943c1c94f6cecc9061c9619205b1556b62f0bbbfb"}
Apr 16 19:55:28.222096 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:28.222039 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.250399444 podStartE2EDuration="8.222019829s" podCreationTimestamp="2026-04-16 19:55:20 +0000 UTC" firstStartedPulling="2026-04-16 19:55:21.369145066 +0000 UTC m=+84.208950460" lastFinishedPulling="2026-04-16 19:55:27.340765456 +0000 UTC m=+90.180570845" observedRunningTime="2026-04-16 19:55:28.220434189 +0000 UTC m=+91.060239612" watchObservedRunningTime="2026-04-16 19:55:28.222019829 +0000 UTC m=+91.061825230"
Apr 16 19:55:29.102569 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.102519 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" podUID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" containerName="registry" containerID="cri-o://9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f" gracePeriod=30
Apr 16 19:55:29.344554 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.344531 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd"
Apr 16 19:55:29.494155 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494057 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494155 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494094 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494155 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494132 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494451 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494184 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494451 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494232 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749z8\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494451 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494322 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494451 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494367 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.494451 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.494397 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates\") pod \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\" (UID: \"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1\") "
Apr 16 19:55:29.495158 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.495021 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1").
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.495158 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.495119 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:55:29.497018 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.496974 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:29.497112 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.497029 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8" (OuterVolumeSpecName: "kube-api-access-749z8") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "kube-api-access-749z8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:29.497112 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.497091 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:29.497209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.497112 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:55:29.497209 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.497122 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:55:29.502548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.502525 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" (UID: "2c7aae42-1d5a-4a39-8da1-c98f99b37fd1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:55:29.596011 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.595972 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-image-registry-private-configuration\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596011 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596007 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-trusted-ca\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596011 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596017 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-ca-trust-extracted\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596027 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-installation-pull-secrets\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596038 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-749z8\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-kube-api-access-749z8\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596046 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-tls\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596056 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-bound-sa-token\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:29.596241 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:29.596064 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1-registry-certificates\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:55:30.199595 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.199480 2561 generic.go:358] "Generic (PLEG): container finished" podID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" containerID="9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f" exitCode=0 Apr 16 19:55:30.199780 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.199732 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" event={"ID":"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1","Type":"ContainerDied","Data":"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f"} Apr 16 19:55:30.199780 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.199768 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" 
event={"ID":"2c7aae42-1d5a-4a39-8da1-c98f99b37fd1","Type":"ContainerDied","Data":"a651b8bceecdd6471fe7d8e7e51501977da5c7cecc012bc814bb4eaa1efcb961"} Apr 16 19:55:30.199872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.199790 2561 scope.go:117] "RemoveContainer" containerID="9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f" Apr 16 19:55:30.199928 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.199908 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6454ccf87c-rjzjd" Apr 16 19:55:30.211290 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.211266 2561 scope.go:117] "RemoveContainer" containerID="9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f" Apr 16 19:55:30.211800 ip-10-0-130-116 kubenswrapper[2561]: E0416 19:55:30.211769 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f\": container with ID starting with 9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f not found: ID does not exist" containerID="9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f" Apr 16 19:55:30.211913 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.211812 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f"} err="failed to get container status \"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f\": rpc error: code = NotFound desc = could not find container \"9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f\": container with ID starting with 9f8de30969a5f58b1e46a84ee24db014fcdaba3c374f828cb584541a5d0bfe8f not found: ID does not exist" Apr 16 19:55:30.222688 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.222660 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"] Apr 16 19:55:30.229744 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:30.229711 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6454ccf87c-rjzjd"] Apr 16 19:55:31.220629 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:31.220597 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:31.808491 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:55:31.808458 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" path="/var/lib/kubelet/pods/2c7aae42-1d5a-4a39-8da1-c98f99b37fd1/volumes" Apr 16 19:56:21.220568 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:21.220534 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:21.240082 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:21.240050 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:21.358748 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:21.358722 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:39.267149 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267106 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:39.267656 
ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267567 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="prometheus" containerID="cri-o://8e37cd21539a66ebb2fb9fde14484532fef4192f4ea135b8cfce6f289c6792cb" gracePeriod=600 Apr 16 19:56:39.267736 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267641 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy" containerID="cri-o://cd6cddede3db107a6862fb048867838f3b7a6d8345797bb5f876184a6e884375" gracePeriod=600 Apr 16 19:56:39.267736 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267715 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e4231a4ba1cbe69cd584a5847bacf536c97d3cb62d9cbc7f524b553247721199" gracePeriod=600 Apr 16 19:56:39.267834 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267774 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-web" containerID="cri-o://d09eeadbebe65489aa7f47ddfc00f427d8eacba64f884a2eaecd25b2db56efca" gracePeriod=600 Apr 16 19:56:39.267834 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267819 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="config-reloader" containerID="cri-o://a3359dfc38e226836220a04532848ad40c44315b706aea74e89fb77781b37085" gracePeriod=600 Apr 16 19:56:39.267931 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.267899 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="thanos-sidecar" containerID="cri-o://9329be1e586537d4eb80c7e943c1c94f6cecc9061c9619205b1556b62f0bbbfb" gracePeriod=600 Apr 16 19:56:39.399594 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399564 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="e4231a4ba1cbe69cd584a5847bacf536c97d3cb62d9cbc7f524b553247721199" exitCode=0 Apr 16 19:56:39.399594 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399592 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="cd6cddede3db107a6862fb048867838f3b7a6d8345797bb5f876184a6e884375" exitCode=0 Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399602 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="d09eeadbebe65489aa7f47ddfc00f427d8eacba64f884a2eaecd25b2db56efca" exitCode=0 Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399611 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="9329be1e586537d4eb80c7e943c1c94f6cecc9061c9619205b1556b62f0bbbfb" exitCode=0 Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399618 2561 generic.go:358] "Generic (PLEG): container finished" 
podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="a3359dfc38e226836220a04532848ad40c44315b706aea74e89fb77781b37085" exitCode=0 Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399625 2561 generic.go:358] "Generic (PLEG): container finished" podID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerID="8e37cd21539a66ebb2fb9fde14484532fef4192f4ea135b8cfce6f289c6792cb" exitCode=0 Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399633 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"e4231a4ba1cbe69cd584a5847bacf536c97d3cb62d9cbc7f524b553247721199"} Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399677 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"cd6cddede3db107a6862fb048867838f3b7a6d8345797bb5f876184a6e884375"} Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399692 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"d09eeadbebe65489aa7f47ddfc00f427d8eacba64f884a2eaecd25b2db56efca"} Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399707 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"9329be1e586537d4eb80c7e943c1c94f6cecc9061c9619205b1556b62f0bbbfb"} Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399720 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"a3359dfc38e226836220a04532848ad40c44315b706aea74e89fb77781b37085"} Apr 16 19:56:39.399762 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.399732 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"8e37cd21539a66ebb2fb9fde14484532fef4192f4ea135b8cfce6f289c6792cb"} Apr 16 19:56:39.511791 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.511769 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:39.587702 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.587666 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.587872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.587725 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.587872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.587761 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6lnw\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.587872 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.587790 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588057 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588033 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:39.588199 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.587815 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588289 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588243 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588347 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588338 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588407 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588364 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588407 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588396 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588504 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588429 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588557 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588531 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588607 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588559 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588607 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588590 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588711 
ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588617 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588711 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588658 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588825 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588713 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588825 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588739 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.588825 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.588769 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\" (UID: \"c68fa57d-b8c4-40fd-af10-efcd03b1720b\") " Apr 16 19:56:39.589021 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.589004 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.590281 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.589370 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:39.590899 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.590515 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:39.590899 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.590875 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:39.591597 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.591571 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.591946 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.591922 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:56:39.592043 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.591988 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw" (OuterVolumeSpecName: "kube-api-access-h6lnw") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "kube-api-access-h6lnw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:39.592043 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.592008 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.592427 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.592394 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:39.592563 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.592519 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:56:39.595879 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.595811 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.595980 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.595878 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.595980 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.595896 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out" (OuterVolumeSpecName: "config-out") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:56:39.595980 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.595960 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.595980 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.595974 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.596191 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.596126 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config" (OuterVolumeSpecName: "config") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.596266 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.596232 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.604397 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.604369 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config" (OuterVolumeSpecName: "web-config") pod "c68fa57d-b8c4-40fd-af10-efcd03b1720b" (UID: "c68fa57d-b8c4-40fd-af10-efcd03b1720b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:56:39.690120 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690082 2561 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-grpc-tls\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690120 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690113 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690120 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690125 2561 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-metrics-client-certs\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690137 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690146 2561 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-tls-assets\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690155 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6lnw\" (UniqueName: \"kubernetes.io/projected/c68fa57d-b8c4-40fd-af10-efcd03b1720b-kube-api-access-h6lnw\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690164 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690173 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690182 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 
kubenswrapper[2561]: I0416 19:56:39.690190 2561 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-web-config\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690199 2561 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-secret-kube-rbac-proxy\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690207 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-configmap-metrics-client-ca\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690215 2561 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config-out\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690223 2561 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690233 2561 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c68fa57d-b8c4-40fd-af10-efcd03b1720b-config\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690241 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-k8s-db\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:39.690363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:39.690274 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c68fa57d-b8c4-40fd-af10-efcd03b1720b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\"" Apr 16 19:56:40.405170 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.405134 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c68fa57d-b8c4-40fd-af10-efcd03b1720b","Type":"ContainerDied","Data":"089be0f16ebeafb362596215cd0b807a3cd1add3045fae0ee51e0b768e121fcc"} Apr 16 19:56:40.405586 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.405187 2561 scope.go:117] "RemoveContainer" containerID="e4231a4ba1cbe69cd584a5847bacf536c97d3cb62d9cbc7f524b553247721199" Apr 16 19:56:40.405586 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.405225 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.413993 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.413970 2561 scope.go:117] "RemoveContainer" containerID="cd6cddede3db107a6862fb048867838f3b7a6d8345797bb5f876184a6e884375" Apr 16 19:56:40.421188 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.421169 2561 scope.go:117] "RemoveContainer" containerID="d09eeadbebe65489aa7f47ddfc00f427d8eacba64f884a2eaecd25b2db56efca" Apr 16 19:56:40.427744 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.427727 2561 scope.go:117] "RemoveContainer" containerID="9329be1e586537d4eb80c7e943c1c94f6cecc9061c9619205b1556b62f0bbbfb" Apr 16 19:56:40.429446 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.429415 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:40.433375 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.433354 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:40.435376 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.435360 2561 scope.go:117] "RemoveContainer" containerID="a3359dfc38e226836220a04532848ad40c44315b706aea74e89fb77781b37085" Apr 16 19:56:40.441519 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.441497 2561 scope.go:117] "RemoveContainer" containerID="8e37cd21539a66ebb2fb9fde14484532fef4192f4ea135b8cfce6f289c6792cb" Apr 16 19:56:40.448117 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.448096 2561 scope.go:117] "RemoveContainer" containerID="ec8ee76d5209978c532b99d11876374eb889cdd4180d576efefbef242d33c458" Apr 16 19:56:40.456331 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456309 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:40.456572 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456560 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-thanos" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456573 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-thanos" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456583 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="config-reloader" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456589 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="config-reloader" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456599 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="init-config-reloader" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456605 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="init-config-reloader" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456612 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="thanos-sidecar" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456617 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="thanos-sidecar" Apr 16 19:56:40.456619 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456623 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="prometheus" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456628 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="prometheus" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456635 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456640 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456647 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" containerName="registry" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456652 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" containerName="registry" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456657 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-web" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456661 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-web" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456703 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c7aae42-1d5a-4a39-8da1-c98f99b37fd1" containerName="registry" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456711 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="prometheus" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456717 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-web" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456722 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy-thanos" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456728 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="config-reloader" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456735 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="kube-rbac-proxy" Apr 16 19:56:40.456877 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.456741 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" containerName="thanos-sidecar" Apr 16 19:56:40.461655 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.461636 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.464622 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.464597 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:56:40.464780 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.464633 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:56:40.464857 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.464848 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:56:40.465058 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465042 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:56:40.465127 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465046 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:56:40.465363 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465339 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:56:40.465486 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465470 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:56:40.465846 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465827 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e5gcma5esfmoi\""
Apr 16 19:56:40.465929 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465856 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:56:40.465929 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.465871 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-d6qch\""
Apr 16 19:56:40.466094 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.466075 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:56:40.466277 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.466239 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:56:40.466715 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.466697 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:56:40.467986 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.467957 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:56:40.471548 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.471525 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:56:40.473126 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.473105 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:40.496154 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496129 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496162 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496188 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496223 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496295 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496335 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496331 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89gq\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-kube-api-access-s89gq\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496378 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496405 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496435 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496467 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496496 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496516 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496534 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496547 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496578 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496564 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496874 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496592 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.496874 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.496635 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597109 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.596999 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597109 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597055 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597109 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597094 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597120 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597145 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597173 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597201 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s89gq\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-kube-api-access-s89gq\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0"
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597227 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597277 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597311 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597369 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597398 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597438 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597427 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597961 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597449 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597961 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597477 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597961 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597503 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.597961 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.597541 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.598218 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.598191 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.598306 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.598231 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.598880 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.598854 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.599803 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.599777 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.600088 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.600062 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.600316 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.600294 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.600647 ip-10-0-130-116 kubenswrapper[2561]: I0416 
19:56:40.600624 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.601059 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.601032 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602229 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602206 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602380 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602358 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-web-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602695 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602655 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce028087-4681-47e2-bfea-e581780ae588-config-out\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602800 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602743 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-config\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602800 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602749 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602913 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602891 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.602961 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.602926 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
19:56:40.603122 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.603103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ce028087-4681-47e2-bfea-e581780ae588-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.603329 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.603309 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ce028087-4681-47e2-bfea-e581780ae588-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.605310 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.605294 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89gq\" (UniqueName: \"kubernetes.io/projected/ce028087-4681-47e2-bfea-e581780ae588-kube-api-access-s89gq\") pod \"prometheus-k8s-0\" (UID: \"ce028087-4681-47e2-bfea-e581780ae588\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.772800 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.772758 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:40.898295 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:40.898205 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:40.901739 ip-10-0-130-116 kubenswrapper[2561]: W0416 19:56:40.901713 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce028087_4681_47e2_bfea_e581780ae588.slice/crio-093be2a31ac036ebfba6f706ad9adfa2db87f71c2e8154a66fbafae96b7333bd WatchSource:0}: Error finding container 093be2a31ac036ebfba6f706ad9adfa2db87f71c2e8154a66fbafae96b7333bd: Status 404 returned error can't find the container with id 093be2a31ac036ebfba6f706ad9adfa2db87f71c2e8154a66fbafae96b7333bd Apr 16 19:56:41.410479 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:41.410445 2561 generic.go:358] "Generic (PLEG): container finished" podID="ce028087-4681-47e2-bfea-e581780ae588" containerID="c7239f072a1dae1825805846f18f61e7cf79e36bf20c6e7a4a5926cd1377db1e" exitCode=0 Apr 16 19:56:41.410865 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:41.410489 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerDied","Data":"c7239f072a1dae1825805846f18f61e7cf79e36bf20c6e7a4a5926cd1377db1e"} Apr 16 19:56:41.410865 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:41.410508 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"093be2a31ac036ebfba6f706ad9adfa2db87f71c2e8154a66fbafae96b7333bd"} Apr 16 19:56:41.809081 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:41.809053 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68fa57d-b8c4-40fd-af10-efcd03b1720b" path="/var/lib/kubelet/pods/c68fa57d-b8c4-40fd-af10-efcd03b1720b/volumes" Apr 16 19:56:42.418027 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.417990 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
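[Editor's note: the manager.go:1169 warning above appears to be a benign startup race — the cgroup watch reports a new crio-* cgroup before the runtime has registered the container, so the immediate lookup 404s and succeeds on a later pass. An invented retry-with-backoff sketch of how such create/register races are typically absorbed (not kubelet's actual code):]

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New("can't find the container with id")

// lookup stands in for querying the runtime about a container the cgroup
// watch surfaced; it fails until the runtime finishes registering it.
func lookup(id string, registeredAt time.Time) error {
	if time.Now().Before(registeredAt) {
		return errNotFound
	}
	return nil
}

func main() {
	id := "093be2a31ac0" // truncated for the example
	registeredAt := time.Now().Add(30 * time.Millisecond)
	for delay := 10 * time.Millisecond; ; delay *= 2 {
		if err := lookup(id, registeredAt); err == nil {
			fmt.Println("found container", id)
			return
		}
		fmt.Println("not yet registered, retrying in", delay)
		time.Sleep(delay)
	}
}
```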
event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"318ef0564f7a61ac4131cee88c312f0f493ed9324d01eae7a96607f9c5d9fc5d"} Apr 16 19:56:42.418027 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.418029 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"c5f55d2b7986efc660074449356284ab1573d6f934e6c7014223e76ca0d79ea9"} Apr 16 19:56:42.418527 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.418041 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"c545ad1bed0e3c5f65f9764eff6495bfa38dc54aa20275bfab6acc07f3d19c4d"} Apr 16 19:56:42.418527 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.418054 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"b76c8a7bf9546e27bdd5140d6d76dfb7ca4618842825873d964ba8e872a02d81"} Apr 16 19:56:42.418527 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.418065 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"1f1c32e2c5f070d506be7c8c413d39fe0667095d016529143d8f3e9efe8db4ce"} Apr 16 19:56:42.418527 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.418075 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ce028087-4681-47e2-bfea-e581780ae588","Type":"ContainerStarted","Data":"07820b523b8594e8d8e22d280b9880dd801a827b4f1182a6f4353e5850a4e1e1"} Apr 16 19:56:42.448273 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:42.448198 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.448184197 podStartE2EDuration="2.448184197s" podCreationTimestamp="2026-04-16 19:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:42.445834318 +0000 UTC m=+165.285639720" watchObservedRunningTime="2026-04-16 19:56:42.448184197 +0000 UTC m=+165.287989597" Apr 16 19:56:45.773650 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:56:45.773609 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:40.773952 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:57:40.773917 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:40.789336 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:57:40.789304 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:41.612928 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:57:41.612893 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:58:57.657137 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:58:57.657105 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 19:58:57.657678 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:58:57.657185 2561 
Apr 16 19:58:57.657678 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:58:57.657185 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 19:58:57.660487 ip-10-0-130-116 kubenswrapper[2561]: I0416 19:58:57.660461 2561 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 20:02:10.466956 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.466924 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-w8rzf"]
Apr 16 20:02:10.469953 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.469931 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w8rzf"
Apr 16 20:02:10.476845 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.476821 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 20:02:10.478313 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.478281 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-sshjd\""
Apr 16 20:02:10.478429 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.478319 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 20:02:10.478429 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.478331 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-w8rzf"]
Apr 16 20:02:10.478429 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.478374 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 20:02:10.578395 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.578358 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqqw\" (UniqueName: \"kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw\") pod \"s3-init-w8rzf\" (UID: \"c90feef0-9f0a-416a-83a3-fe7184863fce\") " pod="kserve/s3-init-w8rzf"
Apr 16 20:02:10.679777 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.679720 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqqw\" (UniqueName: \"kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw\") pod \"s3-init-w8rzf\" (UID: \"c90feef0-9f0a-416a-83a3-fe7184863fce\") " pod="kserve/s3-init-w8rzf"
Apr 16 20:02:10.688807 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.688784 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqqw\" (UniqueName: \"kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw\") pod \"s3-init-w8rzf\" (UID: \"c90feef0-9f0a-416a-83a3-fe7184863fce\") " pod="kserve/s3-init-w8rzf"
Apr 16 20:02:10.792041 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.792010 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w8rzf"
Apr 16 20:02:10.908765 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.908739 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-w8rzf"]
Apr 16 20:02:10.911171 ip-10-0-130-116 kubenswrapper[2561]: W0416 20:02:10.911131 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90feef0_9f0a_416a_83a3_fe7184863fce.slice/crio-1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66 WatchSource:0}: Error finding container 1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66: Status 404 returned error can't find the container with id 1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66
Apr 16 20:02:10.913069 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:10.913047 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:02:11.351124 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:11.351068 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w8rzf" event={"ID":"c90feef0-9f0a-416a-83a3-fe7184863fce","Type":"ContainerStarted","Data":"1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66"}
Apr 16 20:02:16.369637 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:16.369600 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w8rzf" event={"ID":"c90feef0-9f0a-416a-83a3-fe7184863fce","Type":"ContainerStarted","Data":"758c50f45a15bef55043724ed96bfecf85b39962a18c00eb6ecb234f715a8c4b"}
Apr 16 20:02:16.386633 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:16.386579 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-w8rzf" podStartSLOduration=1.955088132 podStartE2EDuration="6.386564273s" podCreationTimestamp="2026-04-16 20:02:10 +0000 UTC" firstStartedPulling="2026-04-16 20:02:10.913229614 +0000 UTC m=+493.753034997" lastFinishedPulling="2026-04-16 20:02:15.344705749 +0000 UTC m=+498.184511138" observedRunningTime="2026-04-16 20:02:16.384848753 +0000 UTC m=+499.224654149" watchObservedRunningTime="2026-04-16 20:02:16.386564273 +0000 UTC m=+499.226369673"
Apr 16 20:02:19.379355 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:19.379317 2561 generic.go:358] "Generic (PLEG): container finished" podID="c90feef0-9f0a-416a-83a3-fe7184863fce" containerID="758c50f45a15bef55043724ed96bfecf85b39962a18c00eb6ecb234f715a8c4b" exitCode=0
Apr 16 20:02:19.379753 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:19.379390 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w8rzf" event={"ID":"c90feef0-9f0a-416a-83a3-fe7184863fce","Type":"ContainerDied","Data":"758c50f45a15bef55043724ed96bfecf85b39962a18c00eb6ecb234f715a8c4b"}
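[Editor's note: the pod_startup_latency_tracker fields above decompose cleanly: on my reading, podStartE2EDuration is observed-running minus creation, and podStartSLOduration is that minus the image-pull window. Recomputing the s3-init-w8rzf numbers as a check (tiny drift against the logged SLO value is expected, since the m=+ monotonic readings differ from the wall-clock strings):]

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Values copied from the log entry above.
	created := parse("2026-04-16 20:02:10 +0000 UTC")
	pullStart := parse("2026-04-16 20:02:10.913229614 +0000 UTC")
	pullEnd := parse("2026-04-16 20:02:15.344705749 +0000 UTC")
	observed := parse("2026-04-16 20:02:16.386564273 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - pullEnd.Sub(pullStart)
	fmt.Println("podStartE2EDuration:", e2e) // 6.386564273s, matching the log
	fmt.Println("podStartSLOduration:", slo) // ~1.955088138s vs logged 1.955088132
}
```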
Apr 16 20:02:20.502168 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:20.502144 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w8rzf"
Apr 16 20:02:20.670753 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:20.670657 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhqqw\" (UniqueName: \"kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw\") pod \"c90feef0-9f0a-416a-83a3-fe7184863fce\" (UID: \"c90feef0-9f0a-416a-83a3-fe7184863fce\") "
Apr 16 20:02:20.672855 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:20.672831 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw" (OuterVolumeSpecName: "kube-api-access-fhqqw") pod "c90feef0-9f0a-416a-83a3-fe7184863fce" (UID: "c90feef0-9f0a-416a-83a3-fe7184863fce"). InnerVolumeSpecName "kube-api-access-fhqqw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:02:20.772135 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:20.772097 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhqqw\" (UniqueName: \"kubernetes.io/projected/c90feef0-9f0a-416a-83a3-fe7184863fce-kube-api-access-fhqqw\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\""
Apr 16 20:02:21.386614 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:21.386576 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w8rzf" event={"ID":"c90feef0-9f0a-416a-83a3-fe7184863fce","Type":"ContainerDied","Data":"1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66"}
Apr 16 20:02:21.386614 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:21.386598 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w8rzf"
Apr 16 20:02:21.386882 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:02:21.386610 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1428b5c8e7286ae6eded0627232d965d69952ffbed7e647ec8a87b5da7d9bc66"
Apr 16 20:03:57.676561 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:03:57.676533 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 20:03:57.678493 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:03:57.678474 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 20:08:57.694594 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:08:57.694518 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 20:08:57.697517 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:08:57.697497 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 20:13:57.713823 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:13:57.713794 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
Apr 16 20:13:57.717014 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:13:57.716994 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log"
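[Editor's note: the recurring "Finished parsing log file" entries come from the kubelet reading container log files under /var/log/pods, which use the CRI logging format: one line per write, "<RFC3339Nano timestamp> <stdout|stderr> <P|F> <message>", where P marks a partial line. A minimal parser sketch; the sample message content is invented, and the real reader also reassembles P-tagged fragments:]

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

type criLine struct {
	ts      time.Time
	stream  string
	partial bool
	msg     string
}

// parseCRILine splits a CRI-format log line into its four fields.
func parseCRILine(line string) (criLine, error) {
	parts := strings.SplitN(line, " ", 4)
	if len(parts) != 4 {
		return criLine{}, fmt.Errorf("malformed CRI log line: %q", line)
	}
	ts, err := time.Parse(time.RFC3339Nano, parts[0])
	if err != nil {
		return criLine{}, err
	}
	return criLine{ts: ts, stream: parts[1], partial: parts[2] == "P", msg: parts[3]}, nil
}

func main() {
	l, err := parseCRILine("2026-04-16T19:58:57.657105Z stdout F acl_log: verdict=allow")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s [%s] partial=%v %s\n", l.ts.Format(time.RFC3339), l.stream, l.partial, l.msg)
}
```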
Apr 16 20:16:11.717741 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.717662 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-964xz/must-gather-h2rx4"]
Apr 16 20:16:11.718177 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.718002 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90feef0-9f0a-416a-83a3-fe7184863fce" containerName="s3-init"
Apr 16 20:16:11.718177 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.718019 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90feef0-9f0a-416a-83a3-fe7184863fce" containerName="s3-init"
Apr 16 20:16:11.718177 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.718072 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90feef0-9f0a-416a-83a3-fe7184863fce" containerName="s3-init"
Apr 16 20:16:11.721077 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.721056 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:11.723874 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.723852 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-964xz\"/\"openshift-service-ca.crt\""
Apr 16 20:16:11.723970 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.723901 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-964xz\"/\"default-dockercfg-vhhrp\""
Apr 16 20:16:11.723970 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.723907 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-964xz\"/\"kube-root-ca.crt\""
Apr 16 20:16:11.729194 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.729167 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-964xz/must-gather-h2rx4"]
Apr 16 20:16:11.771024 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.770994 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:11.771196 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.771034 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kv4\" (UniqueName: \"kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:11.872663 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.872631 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:11.872834 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.872677 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kv4\" (UniqueName: \"kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
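[Editor's note: before admitting the new must-gather pod, the cpu_manager, state_mem, and memory_manager entries above sweep out bookkeeping left behind by the long-gone s3-init container. An illustrative version of that "drop assignments whose pod is no longer active" sweep, with invented types (not the kubelet's actual state code):]

```go
package main

import "fmt"

// key identifies a per-container resource assignment, as the managers track them.
type key struct{ podUID, container string }

// removeStaleState deletes assignments for pods absent from the active set.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments { // deleting during range is safe in Go
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %s/%s\n", k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key]string{
		{"c90feef0-9f0a-416a-83a3-fe7184863fce", "s3-init"}: "cpuset 0-1",
	}
	removeStaleState(assignments, map[string]bool{}) // s3-init's pod is gone
	fmt.Println("remaining assignments:", len(assignments))
}
```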
Apr 16 20:16:11.872993 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.872971 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:11.880754 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:11.880730 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kv4\" (UniqueName: \"kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4\") pod \"must-gather-h2rx4\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") " pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:12.044165 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:12.044129 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:12.159593 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:12.159497 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-964xz/must-gather-h2rx4"]
Apr 16 20:16:12.162194 ip-10-0-130-116 kubenswrapper[2561]: W0416 20:16:12.162161 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77a93c2_c443_411d_9131_66238f83df31.slice/crio-409d0c03d6b48a45c9d2368e9a30789796bb2cb837cafb010ec4f95fab72b040 WatchSource:0}: Error finding container 409d0c03d6b48a45c9d2368e9a30789796bb2cb837cafb010ec4f95fab72b040: Status 404 returned error can't find the container with id 409d0c03d6b48a45c9d2368e9a30789796bb2cb837cafb010ec4f95fab72b040
Apr 16 20:16:12.163805 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:12.163788 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:16:12.702388 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:12.702353 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-964xz/must-gather-h2rx4" event={"ID":"e77a93c2-c443-411d-9131-66238f83df31","Type":"ContainerStarted","Data":"409d0c03d6b48a45c9d2368e9a30789796bb2cb837cafb010ec4f95fab72b040"}
Apr 16 20:16:17.719197 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:17.719165 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-964xz/must-gather-h2rx4" event={"ID":"e77a93c2-c443-411d-9131-66238f83df31","Type":"ContainerStarted","Data":"44f5fdc32f2ba96e04d0b197b1981bf9c78bfcb9fa76008895d6ec17591d6c45"}
Apr 16 20:16:17.719197 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:17.719198 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-964xz/must-gather-h2rx4" event={"ID":"e77a93c2-c443-411d-9131-66238f83df31","Type":"ContainerStarted","Data":"b943b045aca81edd57db5f95aa026593fb59dbb9ea6116f1db9351ed747c51f4"}
Apr 16 20:16:17.735560 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:17.735501 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-964xz/must-gather-h2rx4" podStartSLOduration=2.1887154779999998 podStartE2EDuration="6.735485845s" podCreationTimestamp="2026-04-16 20:16:11 +0000 UTC" firstStartedPulling="2026-04-16 20:16:12.16396492 +0000 UTC m=+1335.003770301" lastFinishedPulling="2026-04-16 20:16:16.710735289 +0000 UTC m=+1339.550540668" observedRunningTime="2026-04-16 20:16:17.73388669 +0000 UTC m=+1340.573692093" watchObservedRunningTime="2026-04-16 20:16:17.735485845 +0000 UTC m=+1340.575291246"
Apr 16 20:16:36.777653 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:36.777615 2561 generic.go:358] "Generic (PLEG): container finished" podID="e77a93c2-c443-411d-9131-66238f83df31" containerID="b943b045aca81edd57db5f95aa026593fb59dbb9ea6116f1db9351ed747c51f4" exitCode=0
Apr 16 20:16:36.777653 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:36.777655 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-964xz/must-gather-h2rx4" event={"ID":"e77a93c2-c443-411d-9131-66238f83df31","Type":"ContainerDied","Data":"b943b045aca81edd57db5f95aa026593fb59dbb9ea6116f1db9351ed747c51f4"}
Apr 16 20:16:36.778084 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:36.777957 2561 scope.go:117] "RemoveContainer" containerID="b943b045aca81edd57db5f95aa026593fb59dbb9ea6116f1db9351ed747c51f4"
Apr 16 20:16:37.192915 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.192829 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-964xz_must-gather-h2rx4_e77a93c2-c443-411d-9131-66238f83df31/gather/0.log"
Apr 16 20:16:37.742132 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.742091 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9spp/must-gather-d297v"]
Apr 16 20:16:37.744554 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.744537 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:37.747456 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.747431 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"openshift-service-ca.crt\""
Apr 16 20:16:37.747573 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.747431 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h9spp\"/\"kube-root-ca.crt\""
Apr 16 20:16:37.747573 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.747434 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h9spp\"/\"default-dockercfg-lq8qc\""
Apr 16 20:16:37.752003 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.751979 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/must-gather-d297v"]
Apr 16 20:16:37.898034 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.898000 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-must-gather-output\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:37.898431 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.898050 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm45\" (UniqueName: \"kubernetes.io/projected/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-kube-api-access-7wm45\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:37.998815 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.998744 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-must-gather-output\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:37.998815 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.998793 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm45\" (UniqueName: \"kubernetes.io/projected/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-kube-api-access-7wm45\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:37.999083 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:37.999065 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-must-gather-output\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:38.007759 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:38.007728 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm45\" (UniqueName: \"kubernetes.io/projected/e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9-kube-api-access-7wm45\") pod \"must-gather-d297v\" (UID: \"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9\") " pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:38.054404 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:38.054378 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/must-gather-d297v"
Apr 16 20:16:38.172493 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:38.172467 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/must-gather-d297v"]
Apr 16 20:16:38.177495 ip-10-0-130-116 kubenswrapper[2561]: W0416 20:16:38.175213 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8b88adf_e4c6_4f20_8b5a_1783c11b1ac9.slice/crio-f4392709cfba8f9b9d49d7240fd2b27ad77a7ce77e332ddf12574e1825253b17 WatchSource:0}: Error finding container f4392709cfba8f9b9d49d7240fd2b27ad77a7ce77e332ddf12574e1825253b17: Status 404 returned error can't find the container with id f4392709cfba8f9b9d49d7240fd2b27ad77a7ce77e332ddf12574e1825253b17
Apr 16 20:16:38.784783 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:38.784724 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/must-gather-d297v" event={"ID":"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9","Type":"ContainerStarted","Data":"f4392709cfba8f9b9d49d7240fd2b27ad77a7ce77e332ddf12574e1825253b17"}
Apr 16 20:16:39.791141 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:39.791099 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/must-gather-d297v" event={"ID":"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9","Type":"ContainerStarted","Data":"51d2b458117fbdb5a59e59e359edbea00824ba2118a14ce6e6f0808a1da73dff"}
Apr 16 20:16:39.791141 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:39.791149 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/must-gather-d297v" event={"ID":"e8b88adf-e4c6-4f20-8b5a-1783c11b1ac9","Type":"ContainerStarted","Data":"c610219a189230275d622b8ce7fa9f154acd2267e6046a1263d91597edc74820"}
Apr 16 20:16:39.806629 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:39.806575 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9spp/must-gather-d297v" podStartSLOduration=2.08878251 podStartE2EDuration="2.806556419s" podCreationTimestamp="2026-04-16 20:16:37 +0000 UTC" firstStartedPulling="2026-04-16 20:16:38.180209334 +0000 UTC m=+1361.020014714" lastFinishedPulling="2026-04-16 20:16:38.897983243 +0000 UTC m=+1361.737788623" observedRunningTime="2026-04-16 20:16:39.806501985 +0000 UTC m=+1362.646307400" watchObservedRunningTime="2026-04-16 20:16:39.806556419 +0000 UTC m=+1362.646361820"
Apr 16 20:16:40.285896 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:40.285860 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z89qz_3ada28f2-9643-4885-b86d-53b74a05e6a5/global-pull-secret-syncer/0.log"
Apr 16 20:16:40.384610 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:40.384577 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-67lnf_2b2bea17-ec01-4d33-b497-41bb92b91043/konnectivity-agent/0.log"
Apr 16 20:16:40.465323 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:40.465293 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-116.ec2.internal_0eb38fade52ea6b8de006afe3c20412c/haproxy/0.log"
Apr 16 20:16:42.576832 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.576785 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-964xz/must-gather-h2rx4"]
Apr 16 20:16:42.577481 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.577074 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-964xz/must-gather-h2rx4" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="copy" containerID="cri-o://44f5fdc32f2ba96e04d0b197b1981bf9c78bfcb9fa76008895d6ec17591d6c45" gracePeriod=2
Apr 16 20:16:42.580401 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.580370 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-964xz/must-gather-h2rx4"]
Apr 16 20:16:42.580598 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.580570 2561 status_manager.go:895] "Failed to get status for pod" podUID="e77a93c2-c443-411d-9131-66238f83df31" pod="openshift-must-gather-964xz/must-gather-h2rx4" err="pods \"must-gather-h2rx4\" is forbidden: User \"system:node:ip-10-0-130-116.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-964xz\": no relationship found between node 'ip-10-0-130-116.ec2.internal' and this object"
Apr 16 20:16:42.847610 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.847480 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-964xz_must-gather-h2rx4_e77a93c2-c443-411d-9131-66238f83df31/copy/0.log"
Apr 16 20:16:42.858914 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.858791 2561 generic.go:358] "Generic (PLEG): container finished" podID="e77a93c2-c443-411d-9131-66238f83df31" containerID="44f5fdc32f2ba96e04d0b197b1981bf9c78bfcb9fa76008895d6ec17591d6c45" exitCode=143
Apr 16 20:16:42.930039 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.929764 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-964xz_must-gather-h2rx4_e77a93c2-c443-411d-9131-66238f83df31/copy/0.log"
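[Editor's note: the status_manager.go:895 "forbidden" error above is the node authorizer at work — once the API object is removed (SyncLoop REMOVE), there is no longer any relationship between this node and the pod, so the node's credential loses "get" on it; a late status update simply races the deletion. A sketch of asking the API server the same authorization question via a SelfSubjectAccessReview; the kubeconfig path is an assumption for the example:]

```go
package main

import (
	"context"
	"fmt"

	authorizationv1 "k8s.io/api/authorization/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	review := &authorizationv1.SelfSubjectAccessReview{
		Spec: authorizationv1.SelfSubjectAccessReviewSpec{
			ResourceAttributes: &authorizationv1.ResourceAttributes{
				Namespace: "openshift-must-gather-964xz",
				Verb:      "get",
				Resource:  "pods",
			},
		},
	}
	resp, err := client.AuthorizationV1().SelfSubjectAccessReviews().Create(
		context.TODO(), review, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("allowed:", resp.Status.Allowed, "reason:", resp.Status.Reason)
}
```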
Apr 16 20:16:42.930706 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.930399 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:42.934531 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:42.934440 2561 status_manager.go:895] "Failed to get status for pod" podUID="e77a93c2-c443-411d-9131-66238f83df31" pod="openshift-must-gather-964xz/must-gather-h2rx4" err="pods \"must-gather-h2rx4\" is forbidden: User \"system:node:ip-10-0-130-116.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-964xz\": no relationship found between node 'ip-10-0-130-116.ec2.internal' and this object"
Apr 16 20:16:43.045329 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.045289 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7kv4\" (UniqueName: \"kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4\") pod \"e77a93c2-c443-411d-9131-66238f83df31\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") "
Apr 16 20:16:43.045511 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.045380 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output\") pod \"e77a93c2-c443-411d-9131-66238f83df31\" (UID: \"e77a93c2-c443-411d-9131-66238f83df31\") "
Apr 16 20:16:43.046797 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.046763 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e77a93c2-c443-411d-9131-66238f83df31" (UID: "e77a93c2-c443-411d-9131-66238f83df31"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:16:43.049369 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.049338 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4" (OuterVolumeSpecName: "kube-api-access-k7kv4") pod "e77a93c2-c443-411d-9131-66238f83df31" (UID: "e77a93c2-c443-411d-9131-66238f83df31"). InnerVolumeSpecName "kube-api-access-k7kv4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:16:43.147209 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.146309 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7kv4\" (UniqueName: \"kubernetes.io/projected/e77a93c2-c443-411d-9131-66238f83df31-kube-api-access-k7kv4\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\""
Apr 16 20:16:43.147209 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.146357 2561 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e77a93c2-c443-411d-9131-66238f83df31-must-gather-output\") on node \"ip-10-0-130-116.ec2.internal\" DevicePath \"\""
Apr 16 20:16:43.811293 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.811241 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77a93c2-c443-411d-9131-66238f83df31" path="/var/lib/kubelet/pods/e77a93c2-c443-411d-9131-66238f83df31/volumes"
Apr 16 20:16:43.865511 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.865478 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-964xz_must-gather-h2rx4_e77a93c2-c443-411d-9131-66238f83df31/copy/0.log"
Apr 16 20:16:43.865936 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.865915 2561 scope.go:117] "RemoveContainer" containerID="44f5fdc32f2ba96e04d0b197b1981bf9c78bfcb9fa76008895d6ec17591d6c45"
Apr 16 20:16:43.866077 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.866060 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-964xz/must-gather-h2rx4"
Apr 16 20:16:43.879181 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:43.879159 2561 scope.go:117] "RemoveContainer" containerID="b943b045aca81edd57db5f95aa026593fb59dbb9ea6116f1db9351ed747c51f4"
Apr 16 20:16:44.178063 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.178026 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xjz7t_c5f736cf-45d9-4458-9bc5-c8de66533a6b/kube-state-metrics/0.log"
Apr 16 20:16:44.203487 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.203447 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xjz7t_c5f736cf-45d9-4458-9bc5-c8de66533a6b/kube-rbac-proxy-main/0.log"
Apr 16 20:16:44.227117 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.227092 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xjz7t_c5f736cf-45d9-4458-9bc5-c8de66533a6b/kube-rbac-proxy-self/0.log"
Apr 16 20:16:44.284037 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.284000 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fpvmn_943b61e6-248b-4330-9e80-81ac4a4dc9cd/monitoring-plugin/0.log"
Apr 16 20:16:44.475232 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.475142 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9qhq_757fd1bb-7b91-41e1-9d8e-881aa2286adc/node-exporter/0.log"
Apr 16 20:16:44.497283 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.497237 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9qhq_757fd1bb-7b91-41e1-9d8e-881aa2286adc/kube-rbac-proxy/0.log"
path="/var/log/pods/openshift-monitoring_node-exporter-v9qhq_757fd1bb-7b91-41e1-9d8e-881aa2286adc/init-textfile/0.log" Apr 16 20:16:44.645231 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.645200 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/prometheus/0.log" Apr 16 20:16:44.664528 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.664500 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/config-reloader/0.log" Apr 16 20:16:44.688637 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.688608 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/thanos-sidecar/0.log" Apr 16 20:16:44.710089 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.710062 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/kube-rbac-proxy-web/0.log" Apr 16 20:16:44.730930 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.730852 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/kube-rbac-proxy/0.log" Apr 16 20:16:44.752162 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.752129 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/kube-rbac-proxy-thanos/0.log" Apr 16 20:16:44.772830 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.772793 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ce028087-4681-47e2-bfea-e581780ae588/init-config-reloader/0.log" Apr 16 20:16:44.800906 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.800877 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hb5qb_5b9eafb4-5fb9-437a-8757-c4b78d699f1f/prometheus-operator/0.log" Apr 16 20:16:44.817552 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.817523 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hb5qb_5b9eafb4-5fb9-437a-8757-c4b78d699f1f/kube-rbac-proxy/0.log" Apr 16 20:16:44.840724 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:44.840671 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qjmfn_f95ab465-dbfa-421d-a505-279bdea9be8c/prometheus-operator-admission-webhook/0.log" Apr 16 20:16:47.559693 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.559660 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr"] Apr 16 20:16:47.560683 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.560660 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="gather" Apr 16 20:16:47.560830 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.560819 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="gather" Apr 16 20:16:47.560920 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.560911 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="copy" Apr 16 
20:16:47.560988 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.560980 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="copy" Apr 16 20:16:47.561159 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.561148 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="gather" Apr 16 20:16:47.561232 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.561224 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e77a93c2-c443-411d-9131-66238f83df31" containerName="copy" Apr 16 20:16:47.564664 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.564640 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.573146 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.573122 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr"] Apr 16 20:16:47.686642 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.686606 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-podres\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.686808 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.686668 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-sys\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.686808 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.686714 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-proc\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.686808 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.686738 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-lib-modules\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.686808 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.686790 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5cb\" (UniqueName: \"kubernetes.io/projected/def11036-5fdc-4e3f-9982-59e8046761ed-kube-api-access-wc5cb\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.787988 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.787951 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-proc\") pod 
\"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.788265 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.788229 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-lib-modules\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.788452 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.788437 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5cb\" (UniqueName: \"kubernetes.io/projected/def11036-5fdc-4e3f-9982-59e8046761ed-kube-api-access-wc5cb\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.788608 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.788587 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-podres\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.788738 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.788725 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-sys\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.789797 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.789769 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-proc\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.790048 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.790032 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-lib-modules\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.790589 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.790570 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-podres\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.790779 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.790751 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def11036-5fdc-4e3f-9982-59e8046761ed-sys\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.799095 ip-10-0-130-116 
kubenswrapper[2561]: I0416 20:16:47.799068 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5cb\" (UniqueName: \"kubernetes.io/projected/def11036-5fdc-4e3f-9982-59e8046761ed-kube-api-access-wc5cb\") pod \"perf-node-gather-daemonset-2q5pr\" (UID: \"def11036-5fdc-4e3f-9982-59e8046761ed\") " pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:47.877494 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:47.877418 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:48.023928 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.023898 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr"] Apr 16 20:16:48.034001 ip-10-0-130-116 kubenswrapper[2561]: W0416 20:16:48.033968 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddef11036_5fdc_4e3f_9982_59e8046761ed.slice/crio-3fae16890912cd75f1981f2ceb05c829a38abe038565ad67d09131834eee5a40 WatchSource:0}: Error finding container 3fae16890912cd75f1981f2ceb05c829a38abe038565ad67d09131834eee5a40: Status 404 returned error can't find the container with id 3fae16890912cd75f1981f2ceb05c829a38abe038565ad67d09131834eee5a40 Apr 16 20:16:48.386539 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.386465 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5jw6k_28d7c952-6b2a-4308-acb9-9864f2a7d6dc/dns/0.log" Apr 16 20:16:48.405903 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.405880 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5jw6k_28d7c952-6b2a-4308-acb9-9864f2a7d6dc/kube-rbac-proxy/0.log" Apr 16 20:16:48.565813 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.565787 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-xlktd_487cb337-b28d-43b2-8430-bacf43a71449/dns-node-resolver/0.log" Apr 16 20:16:48.889276 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.889225 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" event={"ID":"def11036-5fdc-4e3f-9982-59e8046761ed","Type":"ContainerStarted","Data":"2b01728eddbe6842f0ea276a582676994092e470dfa50549664605d3e85d0f17"} Apr 16 20:16:48.889449 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.889285 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" event={"ID":"def11036-5fdc-4e3f-9982-59e8046761ed","Type":"ContainerStarted","Data":"3fae16890912cd75f1981f2ceb05c829a38abe038565ad67d09131834eee5a40"} Apr 16 20:16:48.889449 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.889382 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:48.906869 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.906811 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" podStartSLOduration=1.906795844 podStartE2EDuration="1.906795844s" podCreationTimestamp="2026-04-16 20:16:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:48.905056748 +0000 UTC m=+1371.744862150" 
watchObservedRunningTime="2026-04-16 20:16:48.906795844 +0000 UTC m=+1371.746601245" Apr 16 20:16:48.983199 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:48.983177 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69647495b-szrmb_43164e36-45f6-4070-9897-95da9982dd10/registry/0.log" Apr 16 20:16:49.035162 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:49.035133 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kzf5d_f0fbaacd-1e65-4ba6-a6df-3fd552c3bb55/node-ca/0.log" Apr 16 20:16:50.108195 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:50.108147 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rfcgq_2a59db72-acc4-42f2-934a-6582522efbbc/serve-healthcheck-canary/0.log" Apr 16 20:16:50.530267 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:50.530211 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x64wb_b2af1db9-97e2-4754-951b-1299f4f7a507/kube-rbac-proxy/0.log" Apr 16 20:16:50.548626 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:50.548602 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x64wb_b2af1db9-97e2-4754-951b-1299f4f7a507/exporter/0.log" Apr 16 20:16:50.566501 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:50.566478 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x64wb_b2af1db9-97e2-4754-951b-1299f4f7a507/extractor/0.log" Apr 16 20:16:52.693968 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:52.693936 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-w8rzf_c90feef0-9f0a-416a-83a3-fe7184863fce/s3-init/0.log" Apr 16 20:16:54.902053 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:54.902029 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h9spp/perf-node-gather-daemonset-2q5pr" Apr 16 20:16:56.626037 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:56.625967 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w52n5_c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc/migrator/0.log" Apr 16 20:16:56.647536 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:56.647508 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w52n5_c0fa525c-6b9b-4e3d-98b7-fd18c7a4c4bc/graceful-termination/0.log" Apr 16 20:16:57.979182 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:57.979151 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4s7gc_a3b69b87-4fb1-45dc-ba8f-f3e5ee3ef11b/kube-multus/0.log" Apr 16 20:16:58.321562 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.321530 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/kube-multus-additional-cni-plugins/0.log" Apr 16 20:16:58.342180 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.342152 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/egress-router-binary-copy/0.log" Apr 16 20:16:58.367618 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.367593 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/cni-plugins/0.log" Apr 16 20:16:58.390654 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.390631 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/bond-cni-plugin/0.log" Apr 16 20:16:58.413864 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.413837 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/routeoverride-cni/0.log" Apr 16 20:16:58.441680 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.441647 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/whereabouts-cni-bincopy/0.log" Apr 16 20:16:58.461490 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.461463 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w96qg_d12d0814-9c34-4bb1-b975-721e0ecd4752/whereabouts-cni/0.log" Apr 16 20:16:58.591984 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.591888 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vx6n5_5b25a2cc-2c1d-4c3d-93ff-f73223624d78/network-metrics-daemon/0.log" Apr 16 20:16:58.612020 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:58.611976 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vx6n5_5b25a2cc-2c1d-4c3d-93ff-f73223624d78/kube-rbac-proxy/0.log" Apr 16 20:16:59.450905 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.450825 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-controller/0.log" Apr 16 20:16:59.468304 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.468265 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/0.log" Apr 16 20:16:59.480603 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.480571 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovn-acl-logging/1.log" Apr 16 20:16:59.499705 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.499683 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/kube-rbac-proxy-node/0.log" Apr 16 20:16:59.523490 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.523467 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:16:59.543409 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.543385 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/northd/0.log" Apr 16 20:16:59.561928 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.561905 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/nbdb/0.log" Apr 16 20:16:59.580694 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.580670 2561 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/sbdb/0.log" Apr 16 20:16:59.759260 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:16:59.759173 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2px7v_e71bfafb-1a55-495e-afc2-a4ffd47dedea/ovnkube-controller/0.log" Apr 16 20:17:01.371699 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:17:01.371673 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qq6lx_10a9f71c-fe60-4983-820b-1a8007ba1863/network-check-target-container/0.log" Apr 16 20:17:02.288363 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:17:02.288333 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-z5st5_95692ec0-fe71-4218-8b62-1622b9caabaa/iptables-alerter/0.log" Apr 16 20:17:02.862671 ip-10-0-130-116 kubenswrapper[2561]: I0416 20:17:02.862643 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-d2hvw_c76d029a-2a7e-46c6-a287-817cea8544c3/tuned/0.log"