Apr 16 18:27:45.016569 ip-10-0-129-166 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:27:45.016581 ip-10-0-129-166 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:27:45.016588 ip-10-0-129-166 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:27:45.016861 ip-10-0-129-166 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:27:55.057306 ip-10-0-129-166 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:27:55.057320 ip-10-0-129-166 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0cec42f137184301bab1767eefc8f174 --
Apr 16 18:30:00.171207 ip-10-0-129-166 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:00.588811 ip-10-0-129-166 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:00.588811 ip-10-0-129-166 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:00.588811 ip-10-0-129-166 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:00.588811 ip-10-0-129-166 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:00.588811 ip-10-0-129-166 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:00.591060 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.590902 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:00.593839 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593822 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:00.593839 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593839 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593844 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593847 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593850 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593853 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593856 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593859 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593862 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593864 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593867 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593870 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593872 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593880 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593883 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593888 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593892 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593895 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593898 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593901 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593905 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:00.593900 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593908 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593911 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593915 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593930 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593933 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593936 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593939 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593942 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593944 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593948 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593950 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593953 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593956 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593959 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593962 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593964 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593967 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593969 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593972 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593974 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:00.594387 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593977 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593979 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593983 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593986 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593988 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593991 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593993 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593996 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.593998 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594001 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594003 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594006 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594008 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594011 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594014 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594017 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594020 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594022 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594027 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594031 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:00.594872 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594034 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594036 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594039 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594041 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594044 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594046 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594049 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594052 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594054 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594057 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594059 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594062 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594065 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594068 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594070 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594074 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594077 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594079 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594082 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:00.595370 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594084 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594087 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594089 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594092 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594095 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594097 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594478 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594483 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594487 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594489 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594493 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594496 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594499 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594502 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594504 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594507 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594510 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594512 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594515 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594517 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:00.595822 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594520 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594523 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594525 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594528 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594530 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594533 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594535 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594538 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594541 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594543 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594546 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594549 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594552 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594555 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594558 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594560 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594563 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594565 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594569 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:00.596306 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594573 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594576 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594578 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594581 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594583 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594586 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594589 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594591 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594594 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594596 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594599 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594602 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594604 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594607 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594609 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594612 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594615 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594617 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594621 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594625 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:00.596769 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594628 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594631 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594634 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594636 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594639 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594642 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594645 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594647 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594650 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594652 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594655 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594657 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594660 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594663 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594667 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594669 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594672 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594674 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594677 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:00.597272 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594680 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594682 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594685 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594687 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594690 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594692 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594695 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594698 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594700 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594703 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594705 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594708 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594710 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.594713 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595491 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595500 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595508 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595512 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595518 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595521 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595526 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:30:00.597737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595530 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595534 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595537 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595541 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595545 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595548 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595551 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595554 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595557 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595561 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595563 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595567 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595571 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595574 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595577 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595580 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595584 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595592 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595595 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595598 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595602 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595605 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595608 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595611 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595614 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:30:00.598281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595617 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595622 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595625 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595628 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595631 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595635 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595638 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595643 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595646 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595649 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595652 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595655 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595660 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595663 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595666 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595669 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595672 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595675 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595677 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595681 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595684 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595687 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595689 2570 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595693 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595696 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:30:00.598877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595699 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595703 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595706 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595710 2570 flags.go:64] FLAG: --help="false"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595713 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-129-166.ec2.internal"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595716 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595719 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595722 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595725 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595729 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595732 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595735 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595738 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595741 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595744 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595748 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595750 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595754 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595757 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595760 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595763 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595766 2570 flags.go:64] FLAG: --lock-file=""
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595769 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595772 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:30:00.599594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595775 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595780 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595783 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595786 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595789 2570 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595792 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595795 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595798 2570 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595801 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595805 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595809 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595813 2570 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595816 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595819 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595822 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595825 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595828 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595831 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595835 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595842 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595845 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595848 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595852 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:30:00.600192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595855 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595860 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595863 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595867 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595870 2570 flags.go:64] FLAG: --port="10250"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595873 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595876 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07fe4b69233e533be"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595879 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595882 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595885 2570 flags.go:64] FLAG: --register-node="true"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595888 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595891 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595894 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595898 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595901 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595904 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595908 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595911 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595914 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595928 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595931 2570 flags.go:64] FLAG: --runonce="false"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595934 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595937 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595940 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595943 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595946 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:30:00.600739 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595950 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595953 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595956 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595959 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595962 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595965 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595968 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595971 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595974 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595977 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595982 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595985 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595987 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595992 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595994 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.595997 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596000 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596003 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596006 2570 flags.go:64] FLAG: --v="2"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596010 2570 flags.go:64] FLAG: --version="false"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596014 2570 flags.go:64] FLAG: --vmodule=""
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596018 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596021 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596117 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596121 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:00.601390 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596125 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596128 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596131 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596134 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596136 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596139 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596141 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596144 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596147 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596154 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596156 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596159 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596162 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596165 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596167 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596170 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596173 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596175 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596178 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596180 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:00.602003 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596183 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596186 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596189 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596192 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596194 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596197 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596200 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596202 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596205 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596208 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596211 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596214 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596217 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596220 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596222 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596225 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596227 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596230 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596232 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596235 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:00.602498 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596238 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596242 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596245 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596247 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596250 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596252 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596255 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596258 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596260 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596263 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596265 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596268 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596271 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596273 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596276 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596278 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596281 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596283 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596286 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:00.603010 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596289 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596291 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596294 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596298 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596302 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596304 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596307 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596310 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596313 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596316 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596318 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596321 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596324 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596326 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596330 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596333 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596335 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 
18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596338 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596340 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:00.603478 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596343 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596347 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596351 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596354 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596357 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.596360 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:00.604231 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.596954 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:30:00.605794 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.605771 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:30:00.605794 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.605792 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605863 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605872 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605876 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605881 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605886 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605890 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605894 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605898 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605902 2570 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605906 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605912 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605929 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605934 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605938 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605943 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605947 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605951 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605955 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605959 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:30:00.605967 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605963 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605967 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605971 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605976 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605980 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605984 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605989 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605993 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.605998 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606001 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606006 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606012 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:30:00.606886 
ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606017 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606021 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606025 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606029 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606033 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606037 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606044 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:30:00.606886 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606051 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606055 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606059 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606064 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606069 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606074 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606078 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606083 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606087 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606092 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606097 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606101 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606106 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606110 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606114 2570 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606119 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606123 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606128 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606132 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606137 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:30:00.607601 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606141 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606145 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606149 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606154 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606160 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606165 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606170 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606174 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606179 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606183 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606187 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606191 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606195 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606200 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606204 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606208 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606213 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:30:00.608441 
ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606217 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606223 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606228 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:00.608441 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606232 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606237 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606241 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606246 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606250 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606254 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606258 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606262 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.606270 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606447 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606455 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606461 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606467 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
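The feature_gate.go:384 summary above is the effective result of each parse: Go prints the gate map as space-separated Key:value pairs inside map[...], so the line can be lifted into a dict with nothing but the standard library. A minimal Python sketch — the abridged `line` below is an assumption standing in for the full entry, not a quote of the journal:

    import re

    # The kubelet logs its effective gates as a Go map:
    # "feature gates: {map[Key:bool Key:bool ...]}".
    def parse_feature_gates(entry: str) -> dict:
        m = re.search(r"feature gates: \{map\[(.*)\]\}", entry)
        if not m:
            return {}
        return {k: v == "true"
                for k, v in (p.split(":") for p in m.group(1).split())}

    # Abridged stand-in for the summary entry above (hypothetical short form):
    line = ("I0416 18:30:00.596954 2570 feature_gate.go:384] feature gates: "
            "{map[DynamicResourceAllocation:false ImageVolume:true KMSv1:true]}")
    print(parse_feature_gates(line))
    # -> {'DynamicResourceAllocation': False, 'ImageVolume': True, 'KMSv1': True}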
[Apr 16 18:30:00.609283-18:30:00.611356, kubenswrapper[2570] W0416 18:30:00.606447-18:30:00.606822 feature_gate.go:328 — a third parse repeats the identical "unrecognized feature gate" warning set listed above, beginning with IngressControllerDynamicConfigurationManager, ImageModeStatusReporting and AWSClusterHostedDNSInstall]
Apr 16 18:30:00.609283 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606467 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:00.611356 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:00.606787 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.606831 2570 feature_gate.go:384] feature gates: {map[...]} (identical to the map logged at 18:30:00.596954)
Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.607014 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.609859 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.610829 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.610939 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
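Every unrecognized gate above is warned about once per parse of the gate map, so a tally over the raw journal text is a quick way to confirm that the bursts all cover the same set. A stdlib-only sketch, assuming the unabridged journal output has been saved to kubelet.log (a hypothetical filename):

    import re
    from collections import Counter

    counts = Counter()
    with open("kubelet.log") as f:          # hypothetical capture of this boot
        for entry in f:
            m = re.search(r"unrecognized feature gate: (\S+)", entry)
            if m:
                counts[m.group(1)] += 1

    # If every gate was seen the same number of times, the bursts are
    # repeats of one set rather than different sets.
    print(len(counts), "distinct gates; occurrence counts:",
          sorted(set(counts.values())))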
rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:30:00.611834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.610975 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:30:00.634384 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.634363 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:30:00.636810 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.636778 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:30:00.652992 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.652970 2570 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:30:00.657588 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.657567 2570 log.go:25] "Validated CRI v1 image API" Apr 16 18:30:00.658965 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.658947 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:30:00.663134 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.663115 2570 fs.go:135] Filesystem UUIDs: map[145f5eca-e93d-4a21-ace3-06a322d398c8:/dev/nvme0n1p4 4d81b311-e372-4265-97b4-6d9cb3a16769:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 16 18:30:00.663204 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.663134 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:30:00.666629 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.666608 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:30:00.669310 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.669206 2570 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:00.668068158 +0000 UTC m=+0.385925412 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101370 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22206e1ee8305e54e677173cfa9934 SystemUUID:ec22206e-1ee8-305e-54e6-77173cfa9934 BootID:0cec42f1-3718-4301-bab1-767eefc8f174 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 
Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9c:fe:23:ca:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9c:fe:23:ca:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:18:b2:38:9b:48 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:30:00.669310 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.669306 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:30:00.669427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.669393 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:30:00.670256 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.670235 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:30:00.670394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.670259 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-129-166.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:30:00.670441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.670404 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:30:00.670441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.670412 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:30:00.670441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.670424 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:30:00.671118 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.671108 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:30:00.672588 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.672578 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:00.672693 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.672685 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:30:00.674688 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.674677 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:30:00.674724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.674698 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:30:00.674724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.674710 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:30:00.674724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.674719 2570 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:30:00.674811 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.674730 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:30:00.675914 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.675900 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 
Apr 16 18:30:00.675964 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.675943 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:00.678861 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.678844 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:30:00.680655 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.680642 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:30:00.682085 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682070 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:30:00.682135 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682095 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:30:00.682135 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682105 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:30:00.682135 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682115 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:30:00.682135 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682122 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:30:00.682135 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682131 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682138 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682148 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682156 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682162 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682172 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:30:00.682269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.682181 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:30:00.683013 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.683001 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:30:00.683065 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.683016 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:30:00.684669 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.684640 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:30:00.684747 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.684675 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:30:00.686128 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.686111 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p28v2"
Apr 16 18:30:00.686847 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.686835 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:30:00.686896 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.686888 2570 server.go:1295] "Started kubelet"
Apr 16 18:30:00.687646 ip-10-0-129-166 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:30:00.687803 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.687634 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:30:00.687803 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.687724 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:30:00.688655 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.688590 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:30:00.688751 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.688738 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:30:00.689605 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.689590 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:30:00.689690 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.689655 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:30:00.694124 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.693586 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:00.694262 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.694130 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:30:00.694392 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.694374 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p28v2"
Apr 16 18:30:00.694993 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.694962 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found"
Apr 16 18:30:00.695222 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695204 2570 factory.go:55] Registering systemd factory
Apr 16 18:30:00.695304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695247 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:30:00.695304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695261 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:30:00.695408 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695359 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:30:00.695458 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695414 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:30:00.695458 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695450 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:30:00.695941 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.695885 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:30:00.696037 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.695966 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:30:00.696199 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.696180 2570 factory.go:153] Registering CRI-O factory
Apr 16 18:30:00.696199 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.696202 2570 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:30:00.696578 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.696559 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:30:00.697689 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.697663 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 18:30:00.698116 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.698093 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:30:00.698467 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.698447 2570 factory.go:103] Registering Raw factory
Apr 16 18:30:00.698556 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.698476 2570 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:30:00.699612 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.699222 2570 manager.go:319] Starting recovery of all containers
Apr 16 18:30:00.702483 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.697909 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-166.ec2.internal.18a6e9d435a907e8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-166.ec2.internal,UID:ip-10-0-129-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-166.ec2.internal,},FirstTimestamp:2026-04-16 18:30:00.686847976 +0000 UTC m=+0.404705229,LastTimestamp:2026-04-16 18:30:00.686847976 +0000 UTC m=+0.404705229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-166.ec2.internal,}"
Apr 16 18:30:00.709977 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.709793 2570 manager.go:324] Recovery completed
Apr 16 18:30:00.714038 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.714026 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
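The forbidden errors above are all attributed to system:anonymous: until the client certificate requested via csr-p28v2 is issued and loaded, the kubelet's API calls carry no usable identity, so the node/service watches, the node lease, and even the Starting event are rejected. To line the denials up against the issuance, one more stdlib-only pass over the same hypothetical kubelet.log:

    import re

    stamp = re.compile(r"[IWE]0416 (\d\d:\d\d:\d\d\.\d+)")
    issued, denied = None, []
    with open("kubelet.log") as f:          # hypothetical capture of this boot
        for entry in f:
            m = stamp.search(entry)
            if not m:
                continue
            if "Certificate signing request is issued" in entry:
                issued = m.group(1)
            elif "system:anonymous" in entry:
                denied.append(m.group(1))

    print("client cert issued:", issued)   # 18:30:00.694374 in this boot
    print("anonymous-denied calls:", *denied, sep="\n  ")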
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719214 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.719303 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719246 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.719303 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719257 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.719779 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719761 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:30:00.719779 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719776 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:30:00.719870 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.719791 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:00.722004 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.721991 2570 policy_none.go:49] "None policy: Start" Apr 16 18:30:00.722062 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.722008 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:30:00.722062 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.722038 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:30:00.762592 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.762577 2570 manager.go:341] "Starting Device Plugin manager" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.762611 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.762622 2570 server.go:85] "Starting device plugin registration server" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.762881 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.762892 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.763059 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.763133 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.763142 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.763567 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.763606 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.769078 2570 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 18:30:00.770261 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.770264 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:30:00.770689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.770288 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:30:00.770689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.770304 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:30:00.770689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.770310 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:30:00.770689 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.770340 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:30:00.774825 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.774809 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:00.864069 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.863958 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:00.864984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.864964 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.865121 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.865001 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.865121 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.865016 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.865121 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.865046 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.871128 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.871110 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal"] Apr 16 18:30:00.871195 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.871182 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:00.871973 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.871959 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.872050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.871986 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.872050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.872000 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.873214 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.873198 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.873304 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.873222 2570 kubelet_node_status.go:597] "Error updating node status, will retry" 
err="error getting node \"ip-10-0-129-166.ec2.internal\": node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:00.873304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.873260 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:00.873409 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.873395 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.873446 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.873423 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:00.874319 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874280 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.874404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874328 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.874404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874343 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.874404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874301 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.874404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874399 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.874549 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.874409 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.875914 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.875901 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.875967 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.875942 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:00.876608 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.876592 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:00.876683 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.876620 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:00.876683 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.876635 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:00.889844 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.889819 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:00.907633 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.907616 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-166.ec2.internal\" not found" node="ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.912036 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.912021 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-166.ec2.internal\" not found" node="ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.990808 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:00.990777 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:00.997187 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.997155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.997279 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.997202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" Apr 16 18:30:00.997279 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:00.997226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ca19b534e720fdd7bc90ad9dfbc6cf32-config\") pod \"kube-apiserver-proxy-ip-10-0-129-166.ec2.internal\" (UID: \"ca19b534e720fdd7bc90ad9dfbc6cf32\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" Apr 16 18:30:01.091843 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.091810 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.098267 
Apr 16 18:30:01.098339 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.098267 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ca19b534e720fdd7bc90ad9dfbc6cf32-config\") pod \"kube-apiserver-proxy-ip-10-0-129-166.ec2.internal\" (UID: \"ca19b534e720fdd7bc90ad9dfbc6cf32\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.098339 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.098290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.098339 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.098327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.098436 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.098352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.098436 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.098373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33b16aa1b3125d9e0d75367d83187958-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal\" (UID: \"33b16aa1b3125d9e0d75367d83187958\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.192709 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.192635 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found"
Apr 16 18:30:01.211223 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.211193 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.214863 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.214763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal"
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" Apr 16 18:30:01.293722 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.293684 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.394258 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.394216 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.494788 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.494719 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.595309 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.595279 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.610672 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.610652 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 18:30:01.610807 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.610791 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 18:30:01.693385 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.693357 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:01.694392 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.694380 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:01.695363 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:01.695347 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-166.ec2.internal\" not found" Apr 16 18:30:01.695976 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.695956 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:00 +0000 UTC" deadline="2027-12-07 14:53:40.916927452 +0000 UTC" Apr 16 18:30:01.696038 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.695978 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14396h23m39.220953058s" Apr 16 18:30:01.706886 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.706863 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:30:01.727585 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.727567 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:01.733042 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.733024 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bmsqh" Apr 16 18:30:01.741385 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.741365 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bmsqh" Apr 16 18:30:01.794975 
Apr 16 18:30:01.807954 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.807915 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:01.809998 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.809983 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal"
Apr 16 18:30:01.819862 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.819843 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:01.923708 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:01.923677 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca19b534e720fdd7bc90ad9dfbc6cf32.slice/crio-083d6e3c4d5d149fabdad5a632f5605521b054a4431564721ea5b895d1d33acc WatchSource:0}: Error finding container 083d6e3c4d5d149fabdad5a632f5605521b054a4431564721ea5b895d1d33acc: Status 404 returned error can't find the container with id 083d6e3c4d5d149fabdad5a632f5605521b054a4431564721ea5b895d1d33acc
Apr 16 18:30:01.924124 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:01.924100 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b16aa1b3125d9e0d75367d83187958.slice/crio-34565c16ae52a7a69f5e4ecc88539adeee9b6f050cc27522371814bbed18fc4c WatchSource:0}: Error finding container 34565c16ae52a7a69f5e4ecc88539adeee9b6f050cc27522371814bbed18fc4c: Status 404 returned error can't find the container with id 34565c16ae52a7a69f5e4ecc88539adeee9b6f050cc27522371814bbed18fc4c
Apr 16 18:30:01.927534 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:01.927518 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:02.153992 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.153966 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:02.675378 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.675344 2570 apiserver.go:52] "Watching apiserver"
Apr 16 18:30:02.684873 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.684845 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:30:02.686747 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.686719 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8t42k","kube-system/konnectivity-agent-dqknm","kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal","openshift-image-registry/node-ca-2z4jk","openshift-multus/multus-additional-cni-plugins-9zzzv","openshift-multus/network-metrics-daemon-kc2vf","openshift-network-operator/iptables-alerter-mdxtz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k","openshift-cluster-node-tuning-operator/tuned-dwhd9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal","openshift-multus/multus-sqh7m","openshift-network-diagnostics/network-check-target-vdcwk"]
Apr 16 18:30:02.688825 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.688797 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.690062 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.690039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dqknm"
Apr 16 18:30:02.691369 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.691349 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2z4jk"
Apr 16 18:30:02.693314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693294 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:30:02.693314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693314 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:30:02.693545 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693529 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qq9d8\""
Apr 16 18:30:02.693612 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693598 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9zzzv"
Apr 16 18:30:02.693685 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693671 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:30:02.693764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693748 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:30:02.693813 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693680 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:30:02.693858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693535 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:30:02.693898 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.693597 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-947w9\""
Apr 16 18:30:02.694216 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.694133 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:30:02.694321 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.694242 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:30:02.695756 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.695732 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:30:02.695845 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.695815 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143"
pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:02.698105 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.697965 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.698848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.698789 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 18:30:02.699280 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.699262 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 18:30:02.699941 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.699402 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 18:30:02.699941 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.699594 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:30:02.699941 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.699832 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s6bzs\"" Apr 16 18:30:02.699941 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.699893 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.700680 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.700207 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 18:30:02.700680 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.700474 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:30:02.700680 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.700666 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xn4bx\"" Apr 16 18:30:02.701204 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.701185 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 18:30:02.701978 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.701959 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:02.702211 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.702189 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 18:30:02.702378 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.702363 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.702502 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.702484 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:02.702567 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.702547 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:02.702567 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.702560 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sdrgb\"" Apr 16 18:30:02.702678 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.702657 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:02.704259 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.703851 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.704526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.704506 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 18:30:02.704901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.704876 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xkbdg\"" Apr 16 18:30:02.705382 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.705364 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 18:30:02.705513 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.705387 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qz94d\"" Apr 16 18:30:02.705583 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.705413 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 18:30:02.705631 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.705465 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 18:30:02.706314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-registration-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.706438 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706421 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:02.706521 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.706577 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-system-cni-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.706630 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:02.706630 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706614 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-systemd-units\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.706733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706638 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-modprobe-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.706733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.706733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706689 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42dd4370-c71a-4351-88f5-8f1f146f3846-agent-certs\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.706733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706701 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:02.706733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-systemd\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706736 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-log-socket\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706784 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42dd4370-c71a-4351-88f5-8f1f146f3846-konnectivity-ca\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-var-lib-kubelet\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706829 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706847 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-etc-tuned\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706895 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg7h6\" (UniqueName: \"kubernetes.io/projected/e4fd3994-9933-4354-aff4-2baed763eb94-kube-api-access-tg7h6\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhm28\" (UniqueName: \"kubernetes.io/projected/bfdfe9c3-61ab-4681-93fd-547837fc60cf-kube-api-access-mhm28\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: 
\"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.706984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.706962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-netns\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-ovn\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-sys\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-host\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-cnibin\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-var-lib-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-conf\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 
Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-socket-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k"
Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k"
Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707393 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wg4gm\""
Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp8q\" (UniqueName: \"kubernetes.io/projected/9521e1df-4c34-4a19-bce1-983c6712cca8-kube-api-access-qwp8q\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk"
Apr 16 18:30:02.707434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707432 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysconfig\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-run\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707484 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9521e1df-4c34-4a19-bce1-983c6712cca8-host\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-lib-modules\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-os-release\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707579 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-config\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-env-overrides\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-kubernetes\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707672 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-etc-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707708 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-netd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-script-lib\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k"
Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707774 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-tmp\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-tmp\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-node-log\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b204315-a289-4466-91fd-0714100a1752-ovn-node-metrics-cert\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-device-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4klct\" (UniqueName: \"kubernetes.io/projected/93747e20-acab-493c-8520-f3b549e0c240-kube-api-access-4klct\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.708050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9521e1df-4c34-4a19-bce1-983c6712cca8-serviceca\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707949 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-slash\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.707984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-systemd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708039 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb7km\" (UniqueName: \"kubernetes.io/projected/9b204315-a289-4466-91fd-0714100a1752-kube-api-access-bb7km\") pod 
\"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qkh2\" (UniqueName: \"kubernetes.io/projected/1d7d2281-07bb-4906-844c-f53fbfe57143-kube-api-access-5qkh2\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-sys-fs\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708113 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-bin\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708137 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-kubelet\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.708782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.708199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.743957 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.743899 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:01 +0000 UTC" deadline="2028-02-02 16:26:26.63077848 +0000 UTC" Apr 16 18:30:02.743957 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.743944 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15765h56m23.886838697s" Apr 16 18:30:02.774980 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.774928 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" 
event={"ID":"33b16aa1b3125d9e0d75367d83187958","Type":"ContainerStarted","Data":"34565c16ae52a7a69f5e4ecc88539adeee9b6f050cc27522371814bbed18fc4c"} Apr 16 18:30:02.776499 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.776472 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" event={"ID":"ca19b534e720fdd7bc90ad9dfbc6cf32","Type":"ContainerStarted","Data":"083d6e3c4d5d149fabdad5a632f5605521b054a4431564721ea5b895d1d33acc"} Apr 16 18:30:02.796883 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.796829 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:30:02.809391 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-os-release\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.809391 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809622 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-config\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809622 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809622 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-os-release\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.809622 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-env-overrides\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-kubernetes\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.809737 
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-k8s-cni-cncf-io\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.809737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809707 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clw2p\" (UniqueName: \"kubernetes.io/projected/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-kube-api-access-clw2p\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.809858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-kubernetes\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.809858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809764 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-etc-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-netd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-script-lib\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.809858 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809828 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-etc-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-netd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-tmp\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.809940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-bin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-multus\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810036 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-node-log\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b204315-a289-4466-91fd-0714100a1752-ovn-node-metrics-cert\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810065 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-config\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810084 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-system-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cnibin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-conf-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-node-log\") pod 
\"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810154 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810149 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810211 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-device-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810234 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4klct\" (UniqueName: \"kubernetes.io/projected/93747e20-acab-493c-8520-f3b549e0c240-kube-api-access-4klct\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9521e1df-4c34-4a19-bce1-983c6712cca8-serviceca\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810292 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-slash\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-systemd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.810394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb7km\" (UniqueName: 
\"kubernetes.io/projected/9b204315-a289-4466-91fd-0714100a1752-kube-api-access-bb7km\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-socket-dir-parent\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qkh2\" (UniqueName: \"kubernetes.io/projected/1d7d2281-07bb-4906-844c-f53fbfe57143-kube-api-access-5qkh2\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-sys-fs\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810481 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-bin\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810528 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-netns\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810551 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-hostroot\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-kubelet\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810582 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-systemd\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cni-binary-copy\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-device-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-registration-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810686 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-kubelet\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-system-cni-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.811031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod 
\"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-systemd-units\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-sys-fs\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-modprobe-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-cni-bin\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810881 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42dd4370-c71a-4351-88f5-8f1f146f3846-agent-certs\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-systemd\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.810976 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-log-socket\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811000 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42dd4370-c71a-4351-88f5-8f1f146f3846-konnectivity-ca\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-var-lib-kubelet\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-etc-tuned\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-slash\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811097 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg7h6\" (UniqueName: \"kubernetes.io/projected/e4fd3994-9933-4354-aff4-2baed763eb94-kube-api-access-tg7h6\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.811735 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-systemd-units\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 
18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhm28\" (UniqueName: \"kubernetes.io/projected/bfdfe9c3-61ab-4681-93fd-547837fc60cf-kube-api-access-mhm28\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-netns\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-kubelet\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-ovn\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-sys\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-host\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-cnibin\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9521e1df-4c34-4a19-bce1-983c6712cca8-serviceca\") pod \"node-ca-2z4jk\" (UID: 
\"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-modprobe-d\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-env-overrides\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-registration-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-run-ovn\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-log-socket\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811553 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-sys\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-host\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-cnibin\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.812555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-systemd\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") 
" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811933 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-system-cni-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811986 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-netns\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.811997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-var-lib-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812027 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-host-run-ovn-kubernetes\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812046 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42dd4370-c71a-4351-88f5-8f1f146f3846-konnectivity-ca\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812070 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-conf\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812098 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-daemon-config\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812128 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d528\" (UniqueName: \"kubernetes.io/projected/d3f2c90c-d5bc-430b-8c49-774156034361-kube-api-access-9d528\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812185 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-socket-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812231 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwp8q\" (UniqueName: \"kubernetes.io/projected/9521e1df-4c34-4a19-bce1-983c6712cca8-kube-api-access-qwp8q\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-etc-kubernetes\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.813422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysconfig\") pod \"tuned-dwhd9\" (UID: 
\"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812338 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-run\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9521e1df-4c34-4a19-bce1-983c6712cca8-host\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b204315-a289-4466-91fd-0714100a1752-ovnkube-script-lib\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812401 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d3f2c90c-d5bc-430b-8c49-774156034361-iptables-alerter-script\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3f2c90c-d5bc-430b-8c49-774156034361-host-slash\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b204315-a289-4466-91fd-0714100a1752-var-lib-openvswitch\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-lib-modules\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812502 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4fd3994-9933-4354-aff4-2baed763eb94-cni-binary-copy\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-os-release\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-multus-certs\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812704 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-etc-selinux\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-run\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-socket-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-lib-modules\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysconfig\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bfdfe9c3-61ab-4681-93fd-547837fc60cf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.814217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812853 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-var-lib-kubelet\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.815007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/93747e20-acab-493c-8520-f3b549e0c240-etc-sysctl-conf\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.815007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.812908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9521e1df-4c34-4a19-bce1-983c6712cca8-host\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.815007 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.812939 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:02.815007 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.813030 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:03.313000934 +0000 UTC m=+3.030858177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:02.815700 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.815677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-tmp\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.816555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.816530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b204315-a289-4466-91fd-0714100a1752-ovn-node-metrics-cert\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.816661 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.816641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/93747e20-acab-493c-8520-f3b549e0c240-etc-tuned\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.817600 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.817576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42dd4370-c71a-4351-88f5-8f1f146f3846-agent-certs\") pod \"konnectivity-agent-dqknm\" (UID: \"42dd4370-c71a-4351-88f5-8f1f146f3846\") " pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:02.820554 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.820155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qkh2\" (UniqueName: \"kubernetes.io/projected/1d7d2281-07bb-4906-844c-f53fbfe57143-kube-api-access-5qkh2\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:02.820907 ip-10-0-129-166 
kubenswrapper[2570]: I0416 18:30:02.820882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb7km\" (UniqueName: \"kubernetes.io/projected/9b204315-a289-4466-91fd-0714100a1752-kube-api-access-bb7km\") pod \"ovnkube-node-8t42k\" (UID: \"9b204315-a289-4466-91fd-0714100a1752\") " pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:02.821689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.821668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4klct\" (UniqueName: \"kubernetes.io/projected/93747e20-acab-493c-8520-f3b549e0c240-kube-api-access-4klct\") pod \"tuned-dwhd9\" (UID: \"93747e20-acab-493c-8520-f3b549e0c240\") " pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:02.822085 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.822066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4fd3994-9933-4354-aff4-2baed763eb94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.823303 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.823275 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg7h6\" (UniqueName: \"kubernetes.io/projected/e4fd3994-9933-4354-aff4-2baed763eb94-kube-api-access-tg7h6\") pod \"multus-additional-cni-plugins-9zzzv\" (UID: \"e4fd3994-9933-4354-aff4-2baed763eb94\") " pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:02.823896 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.823875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp8q\" (UniqueName: \"kubernetes.io/projected/9521e1df-4c34-4a19-bce1-983c6712cca8-kube-api-access-qwp8q\") pod \"node-ca-2z4jk\" (UID: \"9521e1df-4c34-4a19-bce1-983c6712cca8\") " pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:02.825516 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.825490 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhm28\" (UniqueName: \"kubernetes.io/projected/bfdfe9c3-61ab-4681-93fd-547837fc60cf-kube-api-access-mhm28\") pod \"aws-ebs-csi-driver-node-fp65k\" (UID: \"bfdfe9c3-61ab-4681-93fd-547837fc60cf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:02.913585 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3f2c90c-d5bc-430b-8c49-774156034361-host-slash\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-os-release\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-multus-certs\") pod 
\"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-k8s-cni-cncf-io\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3f2c90c-d5bc-430b-8c49-774156034361-host-slash\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913685 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clw2p\" (UniqueName: \"kubernetes.io/projected/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-kube-api-access-clw2p\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-bin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.913764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-multus\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-system-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cnibin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-conf-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod 
\"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913899 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-socket-dir-parent\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.913985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-os-release\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914004 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-multus-certs\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-k8s-cni-cncf-io\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914041 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cnibin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-bin\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-netns\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914067 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-conf-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " 
pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-run-netns\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914115 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-cni-multus\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914125 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-hostroot\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-socket-dir-parent\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914155 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cni-binary-copy\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914192 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-kubelet\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-daemon-config\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914266 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d528\" (UniqueName: \"kubernetes.io/projected/d3f2c90c-d5bc-430b-8c49-774156034361-kube-api-access-9d528\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" 
Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-hostroot\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914153 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-system-cni-dir\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-etc-kubernetes\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d3f2c90c-d5bc-430b-8c49-774156034361-iptables-alerter-script\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-etc-kubernetes\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914624 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-host-var-lib-kubelet\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-cni-binary-copy\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d3f2c90c-d5bc-430b-8c49-774156034361-iptables-alerter-script\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:02.914868 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.914816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-multus-daemon-config\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.921174 ip-10-0-129-166 
kubenswrapper[2570]: E0416 18:30:02.921031 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:02.921174 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.921067 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:02.921174 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.921081 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:02.921174 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:02.921145 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:03.421125866 +0000 UTC m=+3.138983121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:02.923491 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.923471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clw2p\" (UniqueName: \"kubernetes.io/projected/ccad455d-b7e3-4ad9-b224-de9cedc28cb3-kube-api-access-clw2p\") pod \"multus-sqh7m\" (UID: \"ccad455d-b7e3-4ad9-b224-de9cedc28cb3\") " pod="openshift-multus/multus-sqh7m" Apr 16 18:30:02.925074 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:02.925052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d528\" (UniqueName: \"kubernetes.io/projected/d3f2c90c-d5bc-430b-8c49-774156034361-kube-api-access-9d528\") pod \"iptables-alerter-mdxtz\" (UID: \"d3f2c90c-d5bc-430b-8c49-774156034361\") " pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:03.001347 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.001266 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:03.013561 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.013532 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:03.022587 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.022557 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2z4jk" Apr 16 18:30:03.027238 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.027216 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" Apr 16 18:30:03.028890 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.028871 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:03.035378 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.035361 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" Apr 16 18:30:03.042009 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.041990 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sqh7m" Apr 16 18:30:03.049613 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.049582 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" Apr 16 18:30:03.057164 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.057139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mdxtz" Apr 16 18:30:03.316696 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.316606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:03.316860 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.316752 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:03.316860 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.316825 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:04.316805951 +0000 UTC m=+4.034663193 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:03.517714 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.517664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:03.517873 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.517844 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:03.517873 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.517863 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:03.517873 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.517874 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:03.518031 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:03.517961 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:04.517943861 +0000 UTC m=+4.235801120 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:03.697208 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.697180 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccad455d_b7e3_4ad9_b224_de9cedc28cb3.slice/crio-209f4f20cf3866580ae5e1a661503a2b0a263fc33db862d0e63682afeb912229 WatchSource:0}: Error finding container 209f4f20cf3866580ae5e1a661503a2b0a263fc33db862d0e63682afeb912229: Status 404 returned error can't find the container with id 209f4f20cf3866580ae5e1a661503a2b0a263fc33db862d0e63682afeb912229 Apr 16 18:30:03.698539 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.698515 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42dd4370_c71a_4351_88f5_8f1f146f3846.slice/crio-85a4f9cace5cd2c65d9fcbeb942e70dba65ef4cc33b4567d0ac709b7b86cb3de WatchSource:0}: Error finding container 85a4f9cace5cd2c65d9fcbeb942e70dba65ef4cc33b4567d0ac709b7b86cb3de: Status 404 returned error can't find the container with id 85a4f9cace5cd2c65d9fcbeb942e70dba65ef4cc33b4567d0ac709b7b86cb3de Apr 16 18:30:03.699120 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.699091 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9521e1df_4c34_4a19_bce1_983c6712cca8.slice/crio-1ee8311906802ca84d4d988439deb0e77b8479dcf5ed65eb6089017a8ca45374 WatchSource:0}: Error finding container 1ee8311906802ca84d4d988439deb0e77b8479dcf5ed65eb6089017a8ca45374: Status 404 returned error can't find the container with id 1ee8311906802ca84d4d988439deb0e77b8479dcf5ed65eb6089017a8ca45374 Apr 16 18:30:03.702110 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.702084 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93747e20_acab_493c_8520_f3b549e0c240.slice/crio-cfdb594a076468eb52d062d1f1b04b99399365573d74feb8ad39d9a58b1a08aa WatchSource:0}: Error finding container cfdb594a076468eb52d062d1f1b04b99399365573d74feb8ad39d9a58b1a08aa: Status 404 returned error can't find the container with id cfdb594a076468eb52d062d1f1b04b99399365573d74feb8ad39d9a58b1a08aa Apr 16 18:30:03.703296 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.703274 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fd3994_9933_4354_aff4_2baed763eb94.slice/crio-954c8c8d69cb093fc5ce28726ae873969e030b3d0b2f091958b999cd78b8982f WatchSource:0}: Error finding container 954c8c8d69cb093fc5ce28726ae873969e030b3d0b2f091958b999cd78b8982f: Status 404 returned error can't find the container with id 954c8c8d69cb093fc5ce28726ae873969e030b3d0b2f091958b999cd78b8982f Apr 16 18:30:03.705069 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.705044 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b204315_a289_4466_91fd_0714100a1752.slice/crio-55ee35e6c1990dd063939e14085fb4e6e26d54932ec8d91513e344b7d205e289 WatchSource:0}: Error finding 
container 55ee35e6c1990dd063939e14085fb4e6e26d54932ec8d91513e344b7d205e289: Status 404 returned error can't find the container with id 55ee35e6c1990dd063939e14085fb4e6e26d54932ec8d91513e344b7d205e289 Apr 16 18:30:03.706624 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.706551 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f2c90c_d5bc_430b_8c49_774156034361.slice/crio-57312ad6ed794d71fe251fda0691ed637b87e3ad03b38e63d70c9f04b6dd87a2 WatchSource:0}: Error finding container 57312ad6ed794d71fe251fda0691ed637b87e3ad03b38e63d70c9f04b6dd87a2: Status 404 returned error can't find the container with id 57312ad6ed794d71fe251fda0691ed637b87e3ad03b38e63d70c9f04b6dd87a2 Apr 16 18:30:03.707752 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:03.707731 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfdfe9c3_61ab_4681_93fd_547837fc60cf.slice/crio-c17c4161ec6824ad3e116d4a971d1de745b2e5d05b7cc56a8b39e5aa3c43c32e WatchSource:0}: Error finding container c17c4161ec6824ad3e116d4a971d1de745b2e5d05b7cc56a8b39e5aa3c43c32e: Status 404 returned error can't find the container with id c17c4161ec6824ad3e116d4a971d1de745b2e5d05b7cc56a8b39e5aa3c43c32e Apr 16 18:30:03.744789 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.744756 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:01 +0000 UTC" deadline="2027-11-20 10:48:31.379735472 +0000 UTC" Apr 16 18:30:03.744789 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.744783 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13984h18m27.634956144s" Apr 16 18:30:03.779515 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.779318 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" event={"ID":"bfdfe9c3-61ab-4681-93fd-547837fc60cf","Type":"ContainerStarted","Data":"c17c4161ec6824ad3e116d4a971d1de745b2e5d05b7cc56a8b39e5aa3c43c32e"} Apr 16 18:30:03.780394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.780370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" event={"ID":"93747e20-acab-493c-8520-f3b549e0c240","Type":"ContainerStarted","Data":"cfdb594a076468eb52d062d1f1b04b99399365573d74feb8ad39d9a58b1a08aa"} Apr 16 18:30:03.782317 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.782296 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2z4jk" event={"ID":"9521e1df-4c34-4a19-bce1-983c6712cca8","Type":"ContainerStarted","Data":"1ee8311906802ca84d4d988439deb0e77b8479dcf5ed65eb6089017a8ca45374"} Apr 16 18:30:03.785381 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.785357 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mdxtz" event={"ID":"d3f2c90c-d5bc-430b-8c49-774156034361","Type":"ContainerStarted","Data":"57312ad6ed794d71fe251fda0691ed637b87e3ad03b38e63d70c9f04b6dd87a2"} Apr 16 18:30:03.786560 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.786515 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"55ee35e6c1990dd063939e14085fb4e6e26d54932ec8d91513e344b7d205e289"} Apr 
16 18:30:03.787551 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.787534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerStarted","Data":"954c8c8d69cb093fc5ce28726ae873969e030b3d0b2f091958b999cd78b8982f"} Apr 16 18:30:03.788466 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.788446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dqknm" event={"ID":"42dd4370-c71a-4351-88f5-8f1f146f3846","Type":"ContainerStarted","Data":"85a4f9cace5cd2c65d9fcbeb942e70dba65ef4cc33b4567d0ac709b7b86cb3de"} Apr 16 18:30:03.789405 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.789383 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sqh7m" event={"ID":"ccad455d-b7e3-4ad9-b224-de9cedc28cb3","Type":"ContainerStarted","Data":"209f4f20cf3866580ae5e1a661503a2b0a263fc33db862d0e63682afeb912229"} Apr 16 18:30:03.797964 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:03.797913 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" podStartSLOduration=2.7979033319999997 podStartE2EDuration="2.797903332s" podCreationTimestamp="2026-04-16 18:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:03.79757547 +0000 UTC m=+3.515432733" watchObservedRunningTime="2026-04-16 18:30:03.797903332 +0000 UTC m=+3.515760595" Apr 16 18:30:04.324173 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.324143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:04.324339 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.324292 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:04.324414 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.324360 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:06.324340299 +0000 UTC m=+6.042197558 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:04.526217 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.526115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:04.526371 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.526263 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:04.526371 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.526283 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:04.526371 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.526295 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:04.526371 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.526351 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:06.526332162 +0000 UTC m=+6.244189419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:04.771105 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.771071 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:04.771551 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.771207 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:04.771637 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.771619 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:04.771731 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:04.771712 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:04.795782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.795242 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-166.ec2.internal" event={"ID":"ca19b534e720fdd7bc90ad9dfbc6cf32","Type":"ContainerStarted","Data":"19912ba01543841d965bde95bbb0b95d7b9913f6436c2523a3824550e54f0b13"} Apr 16 18:30:04.799247 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.798744 2570 generic.go:358] "Generic (PLEG): container finished" podID="33b16aa1b3125d9e0d75367d83187958" containerID="2be3f3c589a7c022205dbb9aa8b1d63037e12add63faa5f8125f72091d759f62" exitCode=0 Apr 16 18:30:04.799247 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:04.798783 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" event={"ID":"33b16aa1b3125d9e0d75367d83187958","Type":"ContainerDied","Data":"2be3f3c589a7c022205dbb9aa8b1d63037e12add63faa5f8125f72091d759f62"} Apr 16 18:30:05.808102 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:05.808062 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" event={"ID":"33b16aa1b3125d9e0d75367d83187958","Type":"ContainerStarted","Data":"88d1c9dcde7f8b6935017434d494af9c66c7e08aaf7e544a72acf12cc0f3608d"} Apr 16 18:30:06.341151 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:06.340570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:06.341151 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.340714 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:06.341151 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.340782 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:10.340761797 +0000 UTC m=+10.058619040 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:06.541894 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:06.541849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:06.542127 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.542107 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:06.542211 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.542134 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:06.542211 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.542147 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:06.542315 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.542215 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:10.542194479 +0000 UTC m=+10.260051723 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:06.772156 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:06.772082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:06.772357 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.772250 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:06.772618 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:06.772084 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:06.772721 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:06.772699 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:08.025064 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.024894 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-166.ec2.internal" podStartSLOduration=7.024871882 podStartE2EDuration="7.024871882s" podCreationTimestamp="2026-04-16 18:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:05.823087445 +0000 UTC m=+5.540944709" watchObservedRunningTime="2026-04-16 18:30:08.024871882 +0000 UTC m=+7.742729159" Apr 16 18:30:08.025537 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.025468 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cljk9"] Apr 16 18:30:08.027654 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.027627 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.030544 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.030394 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:30:08.030753 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.030736 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:30:08.030959 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.030944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xnp2l\"" Apr 16 18:30:08.055439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.055397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cdc1460-1781-45b3-ad12-0173537882af-hosts-file\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.055577 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.055483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cdc1460-1781-45b3-ad12-0173537882af-tmp-dir\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.055577 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.055526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h42vv\" (UniqueName: \"kubernetes.io/projected/3cdc1460-1781-45b3-ad12-0173537882af-kube-api-access-h42vv\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.156802 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.156765 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cdc1460-1781-45b3-ad12-0173537882af-tmp-dir\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.156987 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.156832 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h42vv\" (UniqueName: \"kubernetes.io/projected/3cdc1460-1781-45b3-ad12-0173537882af-kube-api-access-h42vv\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.156987 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.156972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cdc1460-1781-45b3-ad12-0173537882af-hosts-file\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.157113 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.157085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cdc1460-1781-45b3-ad12-0173537882af-hosts-file\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.157159 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.157131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cdc1460-1781-45b3-ad12-0173537882af-tmp-dir\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.175102 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.175044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h42vv\" (UniqueName: \"kubernetes.io/projected/3cdc1460-1781-45b3-ad12-0173537882af-kube-api-access-h42vv\") pod \"node-resolver-cljk9\" (UID: \"3cdc1460-1781-45b3-ad12-0173537882af\") " pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.339838 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.339797 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cljk9" Apr 16 18:30:08.771503 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.771424 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:08.771657 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:08.771432 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:08.771657 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:08.771553 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:08.771657 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:08.771629 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:10.377205 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:10.377168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:10.377716 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.377319 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:10.377716 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.377382 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:18.377363602 +0000 UTC m=+18.095220848 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:10.578579 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:10.578537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:10.578770 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.578733 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:10.578770 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.578760 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:10.578883 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.578775 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:10.578883 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.578844 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:18.57882454 +0000 UTC m=+18.296681796 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:10.771635 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:10.771562 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:10.771782 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.771661 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:10.772173 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:10.772146 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:10.772268 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:10.772247 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:12.770726 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:12.770642 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:12.770726 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:12.770637 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:12.771247 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:12.770782 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:12.771247 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:12.770933 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:14.771286 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:14.771246 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:14.771286 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:14.771290 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:14.771802 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:14.771367 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:14.771802 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:14.771523 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:16.771191 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:16.771152 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:16.771191 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:16.771190 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:16.771652 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:16.771293 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:16.771652 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:16.771429 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:18.430774 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:18.430740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:18.431309 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.430874 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:18.431309 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.430946 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:34.430914991 +0000 UTC m=+34.148772231 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:18.632837 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:18.632802 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:18.633004 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.632981 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:18.633051 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.633006 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:18.633051 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.633019 2570 projected.go:194] Error preparing data for projected volume kube-api-access-z89tb for pod openshift-network-diagnostics/network-check-target-vdcwk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:18.633123 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.633086 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb podName:f3ade13f-7d4c-4574-bfc3-a946ccc0dd37 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:34.63306736 +0000 UTC m=+34.350924610 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z89tb" (UniqueName: "kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb") pod "network-check-target-vdcwk" (UID: "f3ade13f-7d4c-4574-bfc3-a946ccc0dd37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:18.770522 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:18.770445 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:18.770673 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.770561 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:18.770673 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:18.770614 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:18.770785 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:18.770678 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:20.596546 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:20.596511 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cdc1460_1781_45b3_ad12_0173537882af.slice/crio-74072f495652f1af6f77392fe8fccc48d19317fabb7d52ed59f80a212c509356 WatchSource:0}: Error finding container 74072f495652f1af6f77392fe8fccc48d19317fabb7d52ed59f80a212c509356: Status 404 returned error can't find the container with id 74072f495652f1af6f77392fe8fccc48d19317fabb7d52ed59f80a212c509356 Apr 16 18:30:20.773540 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:20.773006 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:20.773540 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:20.773094 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:20.773716 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:20.773587 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:20.773716 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:20.773695 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:20.841333 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:20.841099 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" event={"ID":"93747e20-acab-493c-8520-f3b549e0c240","Type":"ContainerStarted","Data":"241d986d92b0b3e595ca8906efe064aad911787dabc3fea0ff9675b8fcf2422c"} Apr 16 18:30:20.849545 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:20.849252 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cljk9" event={"ID":"3cdc1460-1781-45b3-ad12-0173537882af","Type":"ContainerStarted","Data":"74072f495652f1af6f77392fe8fccc48d19317fabb7d52ed59f80a212c509356"} Apr 16 18:30:21.859702 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.859521 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="f4d01a44f3a5cdd8db07600b889004ec0695ecccf5dbbf5a0a01da34074999d7" exitCode=0 Apr 16 18:30:21.860393 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.859603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"f4d01a44f3a5cdd8db07600b889004ec0695ecccf5dbbf5a0a01da34074999d7"} Apr 16 18:30:21.861179 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.861140 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dqknm" event={"ID":"42dd4370-c71a-4351-88f5-8f1f146f3846","Type":"ContainerStarted","Data":"92534c09f619c374d8f31ac58d0bf426fea1cee03f267410c26104f94b0d82f5"} Apr 16 18:30:21.862399 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.862373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sqh7m" event={"ID":"ccad455d-b7e3-4ad9-b224-de9cedc28cb3","Type":"ContainerStarted","Data":"f3091f3cf596105d8c632d21110916c98b7f8c7c2d9115e21c37d7e272958cb0"} Apr 16 18:30:21.863728 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.863699 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" event={"ID":"bfdfe9c3-61ab-4681-93fd-547837fc60cf","Type":"ContainerStarted","Data":"6aba4edffd5dd697ff96e250dacdd6eed589d4b1eb32985cb95e108f334f3bd7"} Apr 16 18:30:21.864910 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.864891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2z4jk" event={"ID":"9521e1df-4c34-4a19-bce1-983c6712cca8","Type":"ContainerStarted","Data":"fa8d357865b29c1e085063a369216ad574aa3bebbfd586bf3e39f60c465fa831"} Apr 16 18:30:21.866140 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.866109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cljk9" event={"ID":"3cdc1460-1781-45b3-ad12-0173537882af","Type":"ContainerStarted","Data":"2500b5ab9df51aee521404ad7bbe560b61cb7bc042abd07cfb13c659725b01a9"} Apr 16 18:30:21.867316 
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.867292 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mdxtz" event={"ID":"d3f2c90c-d5bc-430b-8c49-774156034361","Type":"ContainerStarted","Data":"f1ccc59c694c4019c7826f4106397d92b3ddb7f21101d6bf653eb24b3d52090a"} Apr 16 18:30:21.869443 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869427 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:30:21.869724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869706 2570 generic.go:358] "Generic (PLEG): container finished" podID="9b204315-a289-4466-91fd-0714100a1752" containerID="5519e9d4c05a97998d2ad3b52671fc6b5acdc45bfcac7ac8008e0483bab8ca86" exitCode=1 Apr 16 18:30:21.869789 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"b008208859ae11401345a053a5da0d9dd57297f2af53956671539c76b32b6be2"} Apr 16 18:30:21.869848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"1d15cfc642e47e508d26418f5ac5df72b13b3d90fdc9d94c93a096d621e5d372"} Apr 16 18:30:21.869848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"c32783424dc8f5a93bf3b60a7ae5edbe6a40fdb90f8d3ccf94fcad71089150f9"} Apr 16 18:30:21.869848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"f91b50310afaa4bed16b9409d40b096a296a6b1a0e0f9865011c533907e9eb5b"} Apr 16 18:30:21.869848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869841 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"0c06cffc27326bebb7607d14fa0dae3956252dfadc30cfea30d5be3726a4346a"} Apr 16 18:30:21.870003 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.869853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerDied","Data":"5519e9d4c05a97998d2ad3b52671fc6b5acdc45bfcac7ac8008e0483bab8ca86"} Apr 16 18:30:21.893832 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.893785 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dwhd9" podStartSLOduration=4.978197462 podStartE2EDuration="21.893771245s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.703805946 +0000 UTC m=+3.421663186" lastFinishedPulling="2026-04-16 18:30:20.619379715 +0000 UTC m=+20.337236969" observedRunningTime="2026-04-16 18:30:20.864236838 +0000 UTC m=+20.582094101" watchObservedRunningTime="2026-04-16 18:30:21.893771245 +0000 UTC m=+21.611628484" Apr 16 18:30:21.925093 
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.925042 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cljk9" podStartSLOduration=13.925028627 podStartE2EDuration="13.925028627s" podCreationTimestamp="2026-04-16 18:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:21.924701541 +0000 UTC m=+21.642558803" watchObservedRunningTime="2026-04-16 18:30:21.925028627 +0000 UTC m=+21.642885889" Apr 16 18:30:21.925355 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.925327 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mdxtz" podStartSLOduration=4.033458296 podStartE2EDuration="20.925319007s" podCreationTimestamp="2026-04-16 18:30:01 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.708319001 +0000 UTC m=+3.426176245" lastFinishedPulling="2026-04-16 18:30:20.600179713 +0000 UTC m=+20.318036956" observedRunningTime="2026-04-16 18:30:21.910166461 +0000 UTC m=+21.628023723" watchObservedRunningTime="2026-04-16 18:30:21.925319007 +0000 UTC m=+21.643176306" Apr 16 18:30:21.939307 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.939257 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dqknm" podStartSLOduration=5.039961908 podStartE2EDuration="21.939244873s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.70089552 +0000 UTC m=+3.418752774" lastFinishedPulling="2026-04-16 18:30:20.600178483 +0000 UTC m=+20.318035739" observedRunningTime="2026-04-16 18:30:21.938899145 +0000 UTC m=+21.656756407" watchObservedRunningTime="2026-04-16 18:30:21.939244873 +0000 UTC m=+21.657102136" Apr 16 18:30:21.953914 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.953866 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2z4jk" podStartSLOduration=5.036292422 podStartE2EDuration="21.953854294s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.702044024 +0000 UTC m=+3.419901264" lastFinishedPulling="2026-04-16 18:30:20.619605883 +0000 UTC m=+20.337463136" observedRunningTime="2026-04-16 18:30:21.953808331 +0000 UTC m=+21.671665592" watchObservedRunningTime="2026-04-16 18:30:21.953854294 +0000 UTC m=+21.671711556" Apr 16 18:30:21.973305 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:21.973265 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sqh7m" podStartSLOduration=4.743917802 podStartE2EDuration="21.973253123s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.699268838 +0000 UTC m=+3.417126094" lastFinishedPulling="2026-04-16 18:30:20.92860416 +0000 UTC m=+20.646461415" observedRunningTime="2026-04-16 18:30:21.972901006 +0000 UTC m=+21.690758267" watchObservedRunningTime="2026-04-16 18:30:21.973253123 +0000 UTC m=+21.691110384" Apr 16 18:30:22.346850 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.346807 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:30:22.771293 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.770747 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:22.771293 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:22.770887 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:22.771293 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.770947 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:22.771293 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:22.771044 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:22.773430 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.773346 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:30:22.346829983Z","UUID":"1c17e759-4f89-480e-8b93-a2a3b870e0e4","Handler":null,"Name":"","Endpoint":""} Apr 16 18:30:22.777534 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.777509 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:30:22.777687 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.777543 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:30:22.874845 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:22.874809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" event={"ID":"bfdfe9c3-61ab-4681-93fd-547837fc60cf","Type":"ContainerStarted","Data":"685092f801a1a3472229ca71f3632c51e2d2a70bbe3ae70b0b572129a485143d"} Apr 16 18:30:23.879298 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:23.879026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" event={"ID":"bfdfe9c3-61ab-4681-93fd-547837fc60cf","Type":"ContainerStarted","Data":"56e80ef22915ab73744d951c2801d23059bd8bb8b51b225a08fe739c032d93a9"} Apr 16 18:30:23.882188 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:23.882164 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:30:23.882570 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:23.882534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"c9ef6eda3a0e677a5dd356e07c16f02de8ddea857cbe9f2aae68586707a09e5b"} Apr 16 18:30:23.900302 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:23.900239 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fp65k" podStartSLOduration=4.038959188 podStartE2EDuration="23.90022076s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.709483248 +0000 UTC m=+3.427340503" lastFinishedPulling="2026-04-16 18:30:23.570744835 +0000 UTC m=+23.288602075" observedRunningTime="2026-04-16 18:30:23.899450149 +0000 UTC m=+23.617307408" watchObservedRunningTime="2026-04-16 18:30:23.90022076 +0000 UTC m=+23.618078023" Apr 16 18:30:24.771096 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:24.771060 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:24.771310 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:24.771068 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:24.771310 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:24.771185 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:24.771310 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:24.771244 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:24.801335 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:24.801296 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:24.801986 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:24.801966 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dqknm" Apr 16 18:30:25.890237 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:25.890212 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:30:26.771233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.771053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:26.771424 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.771111 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:26.771424 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:26.771305 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:26.771424 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:26.771402 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:26.894699 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.894669 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:30:26.895119 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.894953 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"cb3c595a01de3ce716181a567480851e350b79f16c2e5d90ae17a6540edc9482"} Apr 16 18:30:26.895334 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.895310 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:26.895334 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.895342 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:26.895486 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.895412 2570 scope.go:117] "RemoveContainer" containerID="5519e9d4c05a97998d2ad3b52671fc6b5acdc45bfcac7ac8008e0483bab8ca86" Apr 16 18:30:26.896700 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.896675 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="a8aa36ca7d7801e180b359dcaba8b2338146238f524d7eca182def068aca09c8" exitCode=0 Apr 16 18:30:26.896760 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.896721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"a8aa36ca7d7801e180b359dcaba8b2338146238f524d7eca182def068aca09c8"} Apr 16 18:30:26.912343 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:26.912326 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:27.814954 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.814857 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kc2vf"] Apr 16 18:30:27.815184 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.815018 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:27.815184 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:27.815103 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143" Apr 16 18:30:27.817452 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.817429 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vdcwk"] Apr 16 18:30:27.817563 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.817533 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:27.817618 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:27.817601 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37" Apr 16 18:30:27.902255 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.902231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:30:27.902643 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.902546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" event={"ID":"9b204315-a289-4466-91fd-0714100a1752","Type":"ContainerStarted","Data":"57c97226bf3274f0b86b8695a9341c318b9bce536a1c8881936d20d7373f8ec7"} Apr 16 18:30:27.902875 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.902848 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:27.904524 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.904500 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="0bac3bb984eff5f266329d4bbd33b04700bf044c0b8663f611ca28750ca3a58e" exitCode=0 Apr 16 18:30:27.904636 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.904544 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"0bac3bb984eff5f266329d4bbd33b04700bf044c0b8663f611ca28750ca3a58e"} Apr 16 18:30:27.918260 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.918107 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:30:27.934522 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:27.934482 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" podStartSLOduration=10.976490121 podStartE2EDuration="27.934470431s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.707109162 +0000 UTC m=+3.424966405" lastFinishedPulling="2026-04-16 18:30:20.665089457 +0000 UTC m=+20.382946715" observedRunningTime="2026-04-16 18:30:27.932760333 +0000 UTC m=+27.650617594" watchObservedRunningTime="2026-04-16 18:30:27.934470431 +0000 UTC m=+27.652327671" Apr 16 18:30:28.908158 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:28.908125 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="894be9546434f7a431da0fece44c011c0ec6c6de380523223e3f603aabc476ea" exitCode=0 
Apr 16 18:30:28.908527 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:28.908205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"894be9546434f7a431da0fece44c011c0ec6c6de380523223e3f603aabc476ea"}
Apr 16 18:30:29.771080 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:29.771039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:30:29.771260 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:29.771047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk"
Apr 16 18:30:29.771260 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:29.771189 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143"
Apr 16 18:30:29.771343 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:29.771318 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37"
Apr 16 18:30:30.776180 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:30.776103 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dqknm"
Apr 16 18:30:30.776711 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:30.776244 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:30:30.776786 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:30.776767 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dqknm"
Apr 16 18:30:31.770894 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:31.770854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk"
Apr 16 18:30:31.771093 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:31.770905 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:30:31.771093 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:31.771024 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vdcwk" podUID="f3ade13f-7d4c-4574-bfc3-a946ccc0dd37"
Apr 16 18:30:31.771214 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:31.771138 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143"
Apr 16 18:30:33.601465 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.601428 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-166.ec2.internal" event="NodeReady"
Apr 16 18:30:33.602065 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.601589 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:30:33.671402 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.671323 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lp728"]
Apr 16 18:30:33.673880 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.673851 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.675331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.675303 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s87rb"]
Apr 16 18:30:33.677320 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.677278 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:30:33.677896 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.677855 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:30:33.677896 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.677855 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:30:33.680445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.680424 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:30:33.680627 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.680607 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2pvvf\""
Apr 16 18:30:33.680721 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.680648 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:30:33.681999 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.681977 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:30:33.682103 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.682048 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4pkb\""
Apr 16 18:30:33.696496 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.696476 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s87rb"]
Apr 16 18:30:33.703444 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.703423 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp728"]
Apr 16 18:30:33.743194 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m42f\" (UniqueName: \"kubernetes.io/projected/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-kube-api-access-4m42f\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.743194 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743198 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-tmp-dir\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.743427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.743427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:30:33.743427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743388 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nqpf\" (UniqueName: \"kubernetes.io/projected/a25871f6-0ad2-44ac-9f9c-492a30345e0e-kube-api-access-5nqpf\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:30:33.743565 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.743510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-config-volume\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.771250 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.771209 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk"
Apr 16 18:30:33.771442 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.771226 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:30:33.774526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.774325 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:30:33.774526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.774335 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pk24s\""
Apr 16 18:30:33.774526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.774390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:30:33.774750 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.774654 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:30:33.774750 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.774711 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wddp7\""
Apr 16 18:30:33.844486 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:30:33.844486 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqpf\" (UniqueName: \"kubernetes.io/projected/a25871f6-0ad2-44ac-9f9c-492a30345e0e-kube-api-access-5nqpf\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:30:33.844737 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:33.844621 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:30:33.844737 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-config-volume\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.844737 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:33.844688 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:30:34.34466867 +0000 UTC m=+34.062525923 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found
Apr 16 18:30:33.844901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m42f\" (UniqueName: \"kubernetes.io/projected/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-kube-api-access-4m42f\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.844901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-tmp-dir\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.844901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.844822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:30:33.845121 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:33.844946 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:30:33.845121 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:33.844988 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:34.344973322 +0000 UTC m=+34.062830576 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:33.845121 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.845070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-tmp-dir\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:33.845222 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.845203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-config-volume\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:33.855404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.855381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m42f\" (UniqueName: \"kubernetes.io/projected/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-kube-api-access-4m42f\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:33.855524 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:33.855512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqpf\" (UniqueName: \"kubernetes.io/projected/a25871f6-0ad2-44ac-9f9c-492a30345e0e-kube-api-access-5nqpf\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:34.347582 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.347544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:34.347772 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.347603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:34.347772 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.347716 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:34.347772 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.347720 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:34.347998 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.347781 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:30:35.34776289 +0000 UTC m=+35.065620130 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found Apr 16 18:30:34.347998 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.347800 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:35.347790012 +0000 UTC m=+35.065647253 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:34.448555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.448510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:30:34.448745 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.448692 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:30:34.448805 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:34.448787 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:06.448765188 +0000 UTC m=+66.166622467 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : secret "metrics-daemon-secret" not found Apr 16 18:30:34.650773 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.650679 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:34.653235 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.653211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z89tb\" (UniqueName: \"kubernetes.io/projected/f3ade13f-7d4c-4574-bfc3-a946ccc0dd37-kube-api-access-z89tb\") pod \"network-check-target-vdcwk\" (UID: \"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37\") " pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:34.682024 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.681991 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:34.852007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.851824 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vdcwk"] Apr 16 18:30:34.855955 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:34.855895 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3ade13f_7d4c_4574_bfc3_a946ccc0dd37.slice/crio-494e51cefac855408411cf5b854048ae1099687b74349fbfa5c99ed688039f0d WatchSource:0}: Error finding container 494e51cefac855408411cf5b854048ae1099687b74349fbfa5c99ed688039f0d: Status 404 returned error can't find the container with id 494e51cefac855408411cf5b854048ae1099687b74349fbfa5c99ed688039f0d Apr 16 18:30:34.922144 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.922105 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerStarted","Data":"e3eb7b08645f29d3dab2ab8daafe711c76cb77cbc114e5fc681952162481cb2a"} Apr 16 18:30:34.923242 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:34.923215 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vdcwk" event={"ID":"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37","Type":"ContainerStarted","Data":"494e51cefac855408411cf5b854048ae1099687b74349fbfa5c99ed688039f0d"} Apr 16 18:30:35.356450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:35.356416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:35.356649 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:35.356471 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:35.356649 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:35.356578 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:35.356649 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:35.356642 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:30:37.356622356 +0000 UTC m=+37.074479628 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found Apr 16 18:30:35.356816 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:35.356577 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:35.356816 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:35.356685 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:37.356674136 +0000 UTC m=+37.074531377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:35.930180 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:35.930143 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="e3eb7b08645f29d3dab2ab8daafe711c76cb77cbc114e5fc681952162481cb2a" exitCode=0 Apr 16 18:30:35.930180 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:35.930195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"e3eb7b08645f29d3dab2ab8daafe711c76cb77cbc114e5fc681952162481cb2a"} Apr 16 18:30:36.935419 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:36.935375 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4fd3994-9933-4354-aff4-2baed763eb94" containerID="f8cc0b95404226fc18a1eb2352cb1b2eb58f14a7e2ee892aaf1cba73cfb77956" exitCode=0 Apr 16 18:30:36.935862 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:36.935442 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerDied","Data":"f8cc0b95404226fc18a1eb2352cb1b2eb58f14a7e2ee892aaf1cba73cfb77956"} Apr 16 18:30:37.368791 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.368753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:37.368967 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.368812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:37.368967 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:37.368947 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:37.368967 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:37.368950 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:37.369124 ip-10-0-129-166 
kubenswrapper[2570]: E0416 18:30:37.369013 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:30:41.3689947 +0000 UTC m=+41.086851955 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found Apr 16 18:30:37.369124 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:37.369032 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:41.369022584 +0000 UTC m=+41.086879829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:37.939414 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.939203 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vdcwk" event={"ID":"f3ade13f-7d4c-4574-bfc3-a946ccc0dd37","Type":"ContainerStarted","Data":"c631de66c2965ef93224dcf09301b4af9ac9ccd4ffa7f36c9ce3a495add1c166"} Apr 16 18:30:37.939841 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.939430 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vdcwk" Apr 16 18:30:37.942106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.942081 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" event={"ID":"e4fd3994-9933-4354-aff4-2baed763eb94","Type":"ContainerStarted","Data":"ae01dc1808b7cf2fc7ef9c11ca99b2dbffc69cb99ca8baae6e4529de2a206b72"} Apr 16 18:30:37.955994 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.955956 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vdcwk" podStartSLOduration=34.999856686 podStartE2EDuration="37.955940856s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:34.858375542 +0000 UTC m=+34.576232783" lastFinishedPulling="2026-04-16 18:30:37.814459694 +0000 UTC m=+37.532316953" observedRunningTime="2026-04-16 18:30:37.955132021 +0000 UTC m=+37.672989280" watchObservedRunningTime="2026-04-16 18:30:37.955940856 +0000 UTC m=+37.673798111" Apr 16 18:30:37.978577 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:37.978526 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9zzzv" podStartSLOduration=6.947568946 podStartE2EDuration="37.978500125s" podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:30:03.705100564 +0000 UTC m=+3.422957819" lastFinishedPulling="2026-04-16 18:30:34.736031754 +0000 UTC m=+34.453888998" observedRunningTime="2026-04-16 18:30:37.97798053 +0000 UTC m=+37.695837792" watchObservedRunningTime="2026-04-16 18:30:37.978500125 +0000 UTC m=+37.696357387" Apr 16 18:30:39.108733 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.108694 2570 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cqphb"] Apr 16 18:30:39.151782 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.151746 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cqphb"] Apr 16 18:30:39.151954 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.151886 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.154883 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.154859 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:30:39.281906 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.281870 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-dbus\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.281906 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.281906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79a42a16-0e76-463a-87f0-ca53a4f24aa2-original-pull-secret\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.282118 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.282010 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-kubelet-config\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.383259 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.383176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-kubelet-config\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.383400 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.383265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-dbus\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.383400 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.383282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79a42a16-0e76-463a-87f0-ca53a4f24aa2-original-pull-secret\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.383400 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.383305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-kubelet-config\") pod \"global-pull-secret-syncer-cqphb\" (UID: 
\"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.383537 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.383518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/79a42a16-0e76-463a-87f0-ca53a4f24aa2-dbus\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.387414 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.387395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/79a42a16-0e76-463a-87f0-ca53a4f24aa2-original-pull-secret\") pod \"global-pull-secret-syncer-cqphb\" (UID: \"79a42a16-0e76-463a-87f0-ca53a4f24aa2\") " pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.472493 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.472455 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cqphb" Apr 16 18:30:39.602745 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.602713 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cqphb"] Apr 16 18:30:39.605855 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:30:39.605828 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79a42a16_0e76_463a_87f0_ca53a4f24aa2.slice/crio-690eeff8a5872f4b3ad2ae5c80389e01521b94f1dca0b52bc73ea940a3fba043 WatchSource:0}: Error finding container 690eeff8a5872f4b3ad2ae5c80389e01521b94f1dca0b52bc73ea940a3fba043: Status 404 returned error can't find the container with id 690eeff8a5872f4b3ad2ae5c80389e01521b94f1dca0b52bc73ea940a3fba043 Apr 16 18:30:39.946744 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:39.946713 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cqphb" event={"ID":"79a42a16-0e76-463a-87f0-ca53a4f24aa2","Type":"ContainerStarted","Data":"690eeff8a5872f4b3ad2ae5c80389e01521b94f1dca0b52bc73ea940a3fba043"} Apr 16 18:30:41.398336 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:41.398290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:41.398799 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:41.398359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:41.398799 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:41.398470 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:41.398799 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:41.398552 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:49.398531395 +0000 UTC m=+49.116388640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:41.398799 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:41.398477 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:41.398799 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:41.398619 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:30:49.398606349 +0000 UTC m=+49.116463588 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found Apr 16 18:30:43.956657 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:43.956620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cqphb" event={"ID":"79a42a16-0e76-463a-87f0-ca53a4f24aa2","Type":"ContainerStarted","Data":"bed1460238246fe16ae8280abf2aefa7b1e1c1e1ca96f9360f23a82dbd807613"} Apr 16 18:30:43.972882 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:43.972839 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cqphb" podStartSLOduration=1.274819883 podStartE2EDuration="4.972824315s" podCreationTimestamp="2026-04-16 18:30:39 +0000 UTC" firstStartedPulling="2026-04-16 18:30:39.607532301 +0000 UTC m=+39.325389541" lastFinishedPulling="2026-04-16 18:30:43.305536718 +0000 UTC m=+43.023393973" observedRunningTime="2026-04-16 18:30:43.972309975 +0000 UTC m=+43.690167250" watchObservedRunningTime="2026-04-16 18:30:43.972824315 +0000 UTC m=+43.690681622" Apr 16 18:30:49.457483 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:49.457433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:30:49.457968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:49.457505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:30:49.457968 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:49.457592 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:49.457968 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:49.457615 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:49.457968 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:49.457673 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:05.457657052 +0000 UTC m=+65.175514298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:30:49.457968 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:30:49.457688 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:05.457682286 +0000 UTC m=+65.175539526 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found Apr 16 18:30:59.924121 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:30:59.924093 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8t42k" Apr 16 18:31:05.467186 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:05.467132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:31:05.467586 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:05.467208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:31:05.467586 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:05.467300 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:05.467586 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:05.467363 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:37.467346914 +0000 UTC m=+97.185204158 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found Apr 16 18:31:05.467586 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:05.467307 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:05.467586 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:05.467437 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:31:37.467425008 +0000 UTC m=+97.185282249 (durationBeforeRetry 32s). 
Apr 16 18:31:06.474188 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:06.474146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:31:06.474581 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:06.474249 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:31:06.474581 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:06.474310 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:10.474296246 +0000 UTC m=+130.192153486 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : secret "metrics-daemon-secret" not found
Apr 16 18:31:08.946371 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:08.946340 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vdcwk"
Apr 16 18:31:37.477593 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:37.477544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:31:37.477593 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:31:37.477603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:31:37.478025 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:37.477683 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:31:37.478025 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:37.477689 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:31:37.478025 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:37.477734 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:32:41.47772057 +0000 UTC m=+161.195577810 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found
Apr 16 18:31:37.478025 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:31:37.477752 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:41.477738922 +0000 UTC m=+161.195596163 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found
Apr 16 18:32:10.504269 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:10.504222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:32:10.504750 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:10.504371 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:32:10.504750 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:10.504442 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs podName:1d7d2281-07bb-4906-844c-f53fbfe57143 nodeName:}" failed. No retries permitted until 2026-04-16 18:34:12.504425626 +0000 UTC m=+252.222282869 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs") pod "network-metrics-daemon-kc2vf" (UID: "1d7d2281-07bb-4906-844c-f53fbfe57143") : secret "metrics-daemon-secret" not found
Apr 16 18:32:25.073568 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.073532 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"]
Apr 16 18:32:25.075511 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.075496 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.078788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.078765 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:32:25.081964 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.081942 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:32:25.082134 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.081951 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:32:25.082134 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.082043 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:32:25.082870 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.082850 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bb86x\""
Apr 16 18:32:25.096367 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.096343 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"]
Apr 16 18:32:25.172130 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.172097 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"]
Apr 16 18:32:25.173892 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.173877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.177120 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.177099 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:25.177779 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.177755 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:32:25.177859 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.177815 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:25.177859 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.177763 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-ghlhk\""
Apr 16 18:32:25.188568 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.188544 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"]
Apr 16 18:32:25.208345 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.208307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2b58bbe1-c23a-4746-87d2-0f22a039027a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.208488 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.208364 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.208488 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.208411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrwwn\" (UniqueName: \"kubernetes.io/projected/2b58bbe1-c23a-4746-87d2-0f22a039027a-kube-api-access-qrwwn\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.309560 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.309524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2b58bbe1-c23a-4746-87d2-0f22a039027a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.309734 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.309579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.309734 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.309615 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g47sk\" (UniqueName: \"kubernetes.io/projected/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-kube-api-access-g47sk\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.309734 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.309645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.309734 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.309669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrwwn\" (UniqueName: \"kubernetes.io/projected/2b58bbe1-c23a-4746-87d2-0f22a039027a-kube-api-access-qrwwn\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.309863 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.309758 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:25.309863 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.309829 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:25.809812146 +0000 UTC m=+145.527669386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:25.310252 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.310234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2b58bbe1-c23a-4746-87d2-0f22a039027a-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.326857 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.326804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrwwn\" (UniqueName: \"kubernetes.io/projected/2b58bbe1-c23a-4746-87d2-0f22a039027a-kube-api-access-qrwwn\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.410537 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.410506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g47sk\" (UniqueName: \"kubernetes.io/projected/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-kube-api-access-g47sk\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.410712 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.410550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.410712 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.410649 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:25.410712 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.410707 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls podName:aceb2203-0f24-48b7-b1cd-8ea9833ed3f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:25.910690668 +0000 UTC m=+145.628547912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls") pod "cluster-samples-operator-667775844f-z9czd" (UID: "aceb2203-0f24-48b7-b1cd-8ea9833ed3f9") : secret "samples-operator-tls" not found
Apr 16 18:32:25.421438 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.421408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g47sk\" (UniqueName: \"kubernetes.io/projected/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-kube-api-access-g47sk\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.813266 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.813180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:25.813413 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.813316 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:25.813413 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.813379 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:26.813364664 +0000 UTC m=+146.531221908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:25.914395 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:25.914348 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:25.914543 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.914521 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:25.914609 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:25.914602 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls podName:aceb2203-0f24-48b7-b1cd-8ea9833ed3f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:26.914581645 +0000 UTC m=+146.632438892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls") pod "cluster-samples-operator-667775844f-z9czd" (UID: "aceb2203-0f24-48b7-b1cd-8ea9833ed3f9") : secret "samples-operator-tls" not found
Apr 16 18:32:26.819802 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:26.819762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:26.820352 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:26.819972 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:26.820352 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:26.820087 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:28.820066014 +0000 UTC m=+148.537923255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:26.921428 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:26.921374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:26.921637 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:26.921561 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:26.921637 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:26.921638 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls podName:aceb2203-0f24-48b7-b1cd-8ea9833ed3f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:28.921620606 +0000 UTC m=+148.639477864 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls") pod "cluster-samples-operator-667775844f-z9czd" (UID: "aceb2203-0f24-48b7-b1cd-8ea9833ed3f9") : secret "samples-operator-tls" not found
Apr 16 18:32:28.835817 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:28.835778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:28.836208 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:28.835957 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:28.836208 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:28.836036 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.836020552 +0000 UTC m=+152.553877791 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:28.937157 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:28.937121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:28.937316 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:28.937260 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:28.937316 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:28.937323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls podName:aceb2203-0f24-48b7-b1cd-8ea9833ed3f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.937305552 +0000 UTC m=+152.655162792 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls") pod "cluster-samples-operator-667775844f-z9czd" (UID: "aceb2203-0f24-48b7-b1cd-8ea9833ed3f9") : secret "samples-operator-tls" not found
Apr 16 18:32:32.009468 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.009433 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"]
Apr 16 18:32:32.011480 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.011463 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"
Apr 16 18:32:32.037631 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.037609 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2tpxh\""
Apr 16 18:32:32.088292 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.088261 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"]
Apr 16 18:32:32.159384 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.159353 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22ccz\" (UniqueName: \"kubernetes.io/projected/7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a-kube-api-access-22ccz\") pod \"network-check-source-7b678d77c7-lfln6\" (UID: \"7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"
Apr 16 18:32:32.259970 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.259861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22ccz\" (UniqueName: \"kubernetes.io/projected/7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a-kube-api-access-22ccz\") pod \"network-check-source-7b678d77c7-lfln6\" (UID: \"7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"
Apr 16 18:32:32.269689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.269661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22ccz\" (UniqueName: \"kubernetes.io/projected/7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a-kube-api-access-22ccz\") pod \"network-check-source-7b678d77c7-lfln6\" (UID: \"7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"
Apr 16 18:32:32.320186 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.320157 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"
Apr 16 18:32:32.432697 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.432656 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6"]
Apr 16 18:32:32.435455 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:32:32.435425 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7869b7a3_ff6a_4bf3_a99e_c2dd0d14231a.slice/crio-c66b145c1cb1f3cd3c89ceae841178d27867333bb5e76d3d6ad2bcc8ae1652d1 WatchSource:0}: Error finding container c66b145c1cb1f3cd3c89ceae841178d27867333bb5e76d3d6ad2bcc8ae1652d1: Status 404 returned error can't find the container with id c66b145c1cb1f3cd3c89ceae841178d27867333bb5e76d3d6ad2bcc8ae1652d1
Apr 16 18:32:32.478097 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.478068 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"]
Apr 16 18:32:32.480222 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.480206 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.484127 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.484105 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:32:32.484568 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.484549 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5jftk\""
Apr 16 18:32:32.484791 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.484751 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:32:32.484856 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.484808 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:32:32.493564 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.493542 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:32:32.499506 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.499483 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"]
Apr 16 18:32:32.561052 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.560962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561052 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561052 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561308 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561308 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561084 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5m2\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561308 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561308 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561161 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.561308 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.561178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.661684 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.661684 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661683 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.661684 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661703 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661721 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5m2\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661791 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.661805 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.661829 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bf895f67-vrm6x: secret "image-registry-tls" not found
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.661882 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.16186278 +0000 UTC m=+152.879720020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found
Apr 16 18:32:32.662007 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.661815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662360 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.662029 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662360 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.662241 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662707 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.662684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.662808 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.662792 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.664271 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.664252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.664340 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.664299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.671649 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.671623 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.672559 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.672541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5m2\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:32.774402 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.774374 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cljk9_3cdc1460-1781-45b3-ad12-0173537882af/dns-node-resolver/0.log"
Apr 16 18:32:32.845880 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.845846 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"]
Apr 16 18:32:32.848601 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.848586 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"
Apr 16 18:32:32.851421 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.851396 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:32.851545 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.851464 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-q52qx\""
Apr 16 18:32:32.851545 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.851475 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 18:32:32.857610 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.857583 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"]
Apr 16 18:32:32.863784 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.863759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:32.863907 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.863890 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:32.863973 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.863964 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:40.863950085 +0000 UTC m=+160.581807328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:32.964730 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.964690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:32.964897 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:32.964744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5stn\" (UniqueName: \"kubernetes.io/projected/5e59f99d-1bfa-4b7d-96e7-66ff560447ba-kube-api-access-t5stn\") pod \"migrator-64d4d94569-6p5zv\" (UID: \"5e59f99d-1bfa-4b7d-96e7-66ff560447ba\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"
Apr 16 18:32:32.964897 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.964848 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:32.965000 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:32.964913 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls podName:aceb2203-0f24-48b7-b1cd-8ea9833ed3f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:40.964895708 +0000 UTC m=+160.682752948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls") pod "cluster-samples-operator-667775844f-z9czd" (UID: "aceb2203-0f24-48b7-b1cd-8ea9833ed3f9") : secret "samples-operator-tls" not found
Apr 16 18:32:33.065574 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.065536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5stn\" (UniqueName: \"kubernetes.io/projected/5e59f99d-1bfa-4b7d-96e7-66ff560447ba-kube-api-access-t5stn\") pod \"migrator-64d4d94569-6p5zv\" (UID: \"5e59f99d-1bfa-4b7d-96e7-66ff560447ba\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"
Apr 16 18:32:33.075260 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.075232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5stn\" (UniqueName: \"kubernetes.io/projected/5e59f99d-1bfa-4b7d-96e7-66ff560447ba-kube-api-access-t5stn\") pod \"migrator-64d4d94569-6p5zv\" (UID: \"5e59f99d-1bfa-4b7d-96e7-66ff560447ba\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"
Apr 16 18:32:33.158002 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.157935 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"
Apr 16 18:32:33.163330 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.163304 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6" event={"ID":"7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a","Type":"ContainerStarted","Data":"523eab7bcfa21edf8d706d6f824944caa45c4828aff2871b3bbbacded3c7c955"}
Apr 16 18:32:33.163422 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.163337 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6" event={"ID":"7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a","Type":"ContainerStarted","Data":"c66b145c1cb1f3cd3c89ceae841178d27867333bb5e76d3d6ad2bcc8ae1652d1"}
Apr 16 18:32:33.165834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.165805 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:33.165980 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:33.165965 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:33.166028 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:33.165981 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bf895f67-vrm6x: secret "image-registry-tls" not found
Apr 16 18:32:33.166062 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:33.166036 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:34.166020976 +0000 UTC m=+153.883878234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found Apr 16 18:32:33.182625 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.182584 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-lfln6" podStartSLOduration=2.182569358 podStartE2EDuration="2.182569358s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:33.182218577 +0000 UTC m=+152.900075840" watchObservedRunningTime="2026-04-16 18:32:33.182569358 +0000 UTC m=+152.900426620" Apr 16 18:32:33.278371 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.278318 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv"] Apr 16 18:32:33.282759 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:32:33.282729 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e59f99d_1bfa_4b7d_96e7_66ff560447ba.slice/crio-c33c7ba35bd8c2e6e6464f765475cf835f9a0d284feadafffedd2871b108c2f8 WatchSource:0}: Error finding container c33c7ba35bd8c2e6e6464f765475cf835f9a0d284feadafffedd2871b108c2f8: Status 404 returned error can't find the container with id c33c7ba35bd8c2e6e6464f765475cf835f9a0d284feadafffedd2871b108c2f8 Apr 16 18:32:33.360592 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:33.360564 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2z4jk_9521e1df-4c34-4a19-bce1-983c6712cca8/node-ca/0.log" Apr 16 18:32:34.166187 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:34.166148 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv" event={"ID":"5e59f99d-1bfa-4b7d-96e7-66ff560447ba","Type":"ContainerStarted","Data":"c33c7ba35bd8c2e6e6464f765475cf835f9a0d284feadafffedd2871b108c2f8"} Apr 16 18:32:34.176746 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:34.176722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" Apr 16 18:32:34.176899 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:34.176870 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:32:34.176899 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:34.176886 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bf895f67-vrm6x: secret "image-registry-tls" not found Apr 16 18:32:34.177020 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:34.176955 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:36.176940175 +0000 UTC m=+155.894797414 (durationBeforeRetry 2s). 
Apr 16 18:32:34.177020 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:34.176955 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:36.176940175 +0000 UTC m=+155.894797414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found
Apr 16 18:32:35.169759 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:35.169722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv" event={"ID":"5e59f99d-1bfa-4b7d-96e7-66ff560447ba","Type":"ContainerStarted","Data":"632a92f874c736db021ec3a3abd314e0653a11f86b91ecaea6966ddc3d7e80fb"}
Apr 16 18:32:35.169759 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:35.169763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv" event={"ID":"5e59f99d-1bfa-4b7d-96e7-66ff560447ba","Type":"ContainerStarted","Data":"5c350c5ba5198c40cfa556d58aaa8ac7d0863ec6eabff96116d47046d653bc3f"}
Apr 16 18:32:35.196310 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:35.196257 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-6p5zv" podStartSLOduration=2.179275452 podStartE2EDuration="3.196237773s" podCreationTimestamp="2026-04-16 18:32:32 +0000 UTC" firstStartedPulling="2026-04-16 18:32:33.2851019 +0000 UTC m=+153.002959143" lastFinishedPulling="2026-04-16 18:32:34.302064221 +0000 UTC m=+154.019921464" observedRunningTime="2026-04-16 18:32:35.194637934 +0000 UTC m=+154.912495208" watchObservedRunningTime="2026-04-16 18:32:35.196237773 +0000 UTC m=+154.914095037"
Apr 16 18:32:36.194230 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:36.194180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:36.194630 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.194326 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:36.194630 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.194343 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bf895f67-vrm6x: secret "image-registry-tls" not found
Apr 16 18:32:36.194630 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.194394 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:40.194380737 +0000 UTC m=+159.912237978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found
Apr 16 18:32:36.686538 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.686487 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lp728" podUID="8529cef7-b4bb-4d9b-9a9d-cd0b821f2437"
Apr 16 18:32:36.692733 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.692704 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s87rb" podUID="a25871f6-0ad2-44ac-9f9c-492a30345e0e"
Apr 16 18:32:36.787040 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:36.787002 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kc2vf" podUID="1d7d2281-07bb-4906-844c-f53fbfe57143"
Apr 16 18:32:37.173620 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:37.173591 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:32:37.173773 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:37.173592 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lp728"
Apr 16 18:32:40.227345 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:40.227306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:40.227747 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:40.227457 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:32:40.227747 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:40.227475 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bf895f67-vrm6x: secret "image-registry-tls" not found
Apr 16 18:32:40.227747 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:40.227528 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls podName:77d682f3-5ff1-4f82-b33e-e8723e48e5f9 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:48.22751375 +0000 UTC m=+167.945370989 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls") pod "image-registry-8bf895f67-vrm6x" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9") : secret "image-registry-tls" not found
Apr 16 18:32:40.931780 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:40.931728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:40.932014 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:40.931888 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:40.932014 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:40.931981 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls podName:2b58bbe1-c23a-4746-87d2-0f22a039027a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:56.93196262 +0000 UTC m=+176.649819879 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-24f2g" (UID: "2b58bbe1-c23a-4746-87d2-0f22a039027a") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:32:41.033174 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.033144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:41.035984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.035956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aceb2203-0f24-48b7-b1cd-8ea9833ed3f9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-z9czd\" (UID: \"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:41.082132 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.082095 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"
Apr 16 18:32:41.198106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.198033 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd"]
Apr 16 18:32:41.538182 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.538097 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728"
Apr 16 18:32:41.538182 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:41.538160 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb"
Apr 16 18:32:41.538554 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:41.538251 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:32:41.538554 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:41.538307 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls podName:8529cef7-b4bb-4d9b-9a9d-cd0b821f2437 nodeName:}" failed. No retries permitted until 2026-04-16 18:34:43.538293703 +0000 UTC m=+283.256150947 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls") pod "dns-default-lp728" (UID: "8529cef7-b4bb-4d9b-9a9d-cd0b821f2437") : secret "dns-default-metrics-tls" not found
Apr 16 18:32:41.538554 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:41.538309 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:32:41.538554 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:41.538358 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert podName:a25871f6-0ad2-44ac-9f9c-492a30345e0e nodeName:}" failed. No retries permitted until 2026-04-16 18:34:43.538344617 +0000 UTC m=+283.256201862 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert") pod "ingress-canary-s87rb" (UID: "a25871f6-0ad2-44ac-9f9c-492a30345e0e") : secret "canary-serving-cert" not found
Apr 16 18:32:42.183877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:42.183844 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd" event={"ID":"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9","Type":"ContainerStarted","Data":"689dd9e727cbaad1804edda7431515404cf8405a20e32f8aa0d10381427f09b0"}
Apr 16 18:32:43.188487 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:43.188452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd" event={"ID":"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9","Type":"ContainerStarted","Data":"71a150d44fa57d9c9cc3a41e98f664dca4645984c1d67952287a036477efedf5"}
Apr 16 18:32:43.188487 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:43.188487 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd" event={"ID":"aceb2203-0f24-48b7-b1cd-8ea9833ed3f9","Type":"ContainerStarted","Data":"8cd9a8a870e819f9a4fdd7b1993517b2dacabacc3520bcb4f5dae29fd6c1acba"}
Apr 16 18:32:43.205282 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:43.205235 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-z9czd" podStartSLOduration=16.710399633 podStartE2EDuration="18.205219983s" podCreationTimestamp="2026-04-16 18:32:25 +0000 UTC" firstStartedPulling="2026-04-16 18:32:41.236208383 +0000 UTC m=+160.954065624" lastFinishedPulling="2026-04-16 18:32:42.731028733 +0000 UTC m=+162.448885974" observedRunningTime="2026-04-16 18:32:43.204951827 +0000 UTC m=+162.922809091" watchObservedRunningTime="2026-04-16 18:32:43.205219983 +0000 UTC m=+162.923077244"
Apr 16 18:32:48.290003 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:48.289951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:48.292390 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:48.292363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"image-registry-8bf895f67-vrm6x\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
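Note the durationBeforeRetry values in the nestedpendingoperations entries above: the registry-tls mount retries at 1s, 2s, 4s, 8s, while other volumes show 500ms, 16s, and a ceiling of 2m2s. This is consistent with a per-operation doubling backoff starting around 500ms and capped at 2m2s; once the image-registry-tls secret finally exists, the 18:32:48 retry succeeds and the backoff is moot. A minimal sketch of such a policy, with the constants inferred from the delays in this log rather than quoted from kubelet source:

    package main

    import (
    	"fmt"
    	"time"
    )

    // Illustrative doubling backoff with a hard cap, mirroring the delays
    // visible above (500ms, 1s, 2s, 4s, 8s, 16s, ... 2m2s). The constants
    // are assumptions read off the log, not the kubelet's actual code.
    const (
    	initialDelay = 500 * time.Millisecond
    	maxDelay     = 2*time.Minute + 2*time.Second
    )

    func nextDelay(current time.Duration) time.Duration {
    	if current == 0 {
    		return initialDelay
    	}
    	next := 2 * current
    	if next > maxDelay {
    		return maxDelay
    	}
    	return next
    }

    func main() {
    	d := time.Duration(0)
    	for i := 0; i < 10; i++ {
    		d = nextDelay(d)
    		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
    	}
    }

The practical consequence, visible in the dns-default and ingress-canary entries, is that once an operation has backed off to the 2m2s cap, a secret created moments later still waits up to two minutes before the next mount attempt.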
Apr 16 18:32:48.388714 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:48.388670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:48.505057 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:48.505023 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"]
Apr 16 18:32:48.507812 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:32:48.507788 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d682f3_5ff1_4f82_b33e_e8723e48e5f9.slice/crio-fc45b1a4953d8e046aa3acc064fcdd8f2b569bd25feb323c9a83897841914bcd WatchSource:0}: Error finding container fc45b1a4953d8e046aa3acc064fcdd8f2b569bd25feb323c9a83897841914bcd: Status 404 returned error can't find the container with id fc45b1a4953d8e046aa3acc064fcdd8f2b569bd25feb323c9a83897841914bcd
Apr 16 18:32:49.207410 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:49.207376 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" event={"ID":"77d682f3-5ff1-4f82-b33e-e8723e48e5f9","Type":"ContainerStarted","Data":"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be"}
Apr 16 18:32:49.207410 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:49.207410 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" event={"ID":"77d682f3-5ff1-4f82-b33e-e8723e48e5f9","Type":"ContainerStarted","Data":"fc45b1a4953d8e046aa3acc064fcdd8f2b569bd25feb323c9a83897841914bcd"}
Apr 16 18:32:49.207631 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:49.207580 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x"
Apr 16 18:32:49.229982 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:49.229939 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" podStartSLOduration=17.229904711 podStartE2EDuration="17.229904711s" podCreationTimestamp="2026-04-16 18:32:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:49.228950736 +0000 UTC m=+168.946807995" watchObservedRunningTime="2026-04-16 18:32:49.229904711 +0000 UTC m=+168.947761972"
Apr 16 18:32:49.770912 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:49.770883 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf"
Apr 16 18:32:56.956005 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:56.955960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:56.958288 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:56.958266 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2b58bbe1-c23a-4746-87d2-0f22a039027a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-24f2g\" (UID: \"2b58bbe1-c23a-4746-87d2-0f22a039027a\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:57.184463 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:57.184420 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"
Apr 16 18:32:57.312154 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:57.312123 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g"]
Apr 16 18:32:57.316341 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:32:57.316312 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b58bbe1_c23a_4746_87d2_0f22a039027a.slice/crio-d1e9d181a66a33eb266ab5794f1dc0b08189a49b28b8ca9d49fd237976ded5a8 WatchSource:0}: Error finding container d1e9d181a66a33eb266ab5794f1dc0b08189a49b28b8ca9d49fd237976ded5a8: Status 404 returned error can't find the container with id d1e9d181a66a33eb266ab5794f1dc0b08189a49b28b8ca9d49fd237976ded5a8
Apr 16 18:32:58.230844 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.230804 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g" event={"ID":"2b58bbe1-c23a-4746-87d2-0f22a039027a","Type":"ContainerStarted","Data":"d1e9d181a66a33eb266ab5794f1dc0b08189a49b28b8ca9d49fd237976ded5a8"}
Apr 16 18:32:58.420684 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.420652 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kwg6q"]
Apr 16 18:32:58.425680 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.425655 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.428634 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.428604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ppqlw\""
Apr 16 18:32:58.428755 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.428654 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:32:58.428755 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.428669 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:32:58.430115 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.430096 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:32:58.430241 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.430133 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:32:58.435760 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.435740 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kwg6q"]
Apr 16 18:32:58.465504 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.465478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/26032924-872b-462e-9763-af466c1929a8-crio-socket\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.465669 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.465558 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/26032924-872b-462e-9763-af466c1929a8-data-volume\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.465669 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.465594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/26032924-872b-462e-9763-af466c1929a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.465669 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.465619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gl56\" (UniqueName: \"kubernetes.io/projected/26032924-872b-462e-9763-af466c1929a8-kube-api-access-8gl56\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.465669 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.465649 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/26032924-872b-462e-9763-af466c1929a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.484280 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.484212 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"]
Apr 16 18:32:58.566696 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/26032924-872b-462e-9763-af466c1929a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.566696 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gl56\" (UniqueName: \"kubernetes.io/projected/26032924-872b-462e-9763-af466c1929a8-kube-api-access-8gl56\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.566943 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/26032924-872b-462e-9763-af466c1929a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.566943 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/26032924-872b-462e-9763-af466c1929a8-crio-socket\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.566943 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/26032924-872b-462e-9763-af466c1929a8-data-volume\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.566943 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.566900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/26032924-872b-462e-9763-af466c1929a8-crio-socket\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.567153 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.567133 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/26032924-872b-462e-9763-af466c1929a8-data-volume\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.567323 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.567299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/26032924-872b-462e-9763-af466c1929a8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.569394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.569374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/26032924-872b-462e-9763-af466c1929a8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.584070 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.584044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gl56\" (UniqueName: \"kubernetes.io/projected/26032924-872b-462e-9763-af466c1929a8-kube-api-access-8gl56\") pod \"insights-runtime-extractor-kwg6q\" (UID: \"26032924-872b-462e-9763-af466c1929a8\") " pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.737395 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.737304 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kwg6q"
Apr 16 18:32:58.909214 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:58.909041 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kwg6q"]
Apr 16 18:32:58.912134 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:32:58.912113 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26032924_872b_462e_9763_af466c1929a8.slice/crio-e91d7d39722cf43b1462c86b4b71095ec8790d11249f8c968e7c56801c2dff19 WatchSource:0}: Error finding container e91d7d39722cf43b1462c86b4b71095ec8790d11249f8c968e7c56801c2dff19: Status 404 returned error can't find the container with id e91d7d39722cf43b1462c86b4b71095ec8790d11249f8c968e7c56801c2dff19
Apr 16 18:32:59.235086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.235047 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g" event={"ID":"2b58bbe1-c23a-4746-87d2-0f22a039027a","Type":"ContainerStarted","Data":"b1c81d62430399dafd63ae1391d750f7f64b0e6c170724a7cfebe702b9bc2a7d"}
Apr 16 18:32:59.236394 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.236372 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kwg6q" event={"ID":"26032924-872b-462e-9763-af466c1929a8","Type":"ContainerStarted","Data":"4e22bd1042e2a8411252b07d53adff1fd534ed8b91b843e4aa71ad3c6583c89b"}
Apr 16 18:32:59.236491 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.236402 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kwg6q" event={"ID":"26032924-872b-462e-9763-af466c1929a8","Type":"ContainerStarted","Data":"e91d7d39722cf43b1462c86b4b71095ec8790d11249f8c968e7c56801c2dff19"}
Apr 16 18:32:59.259775 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.259732 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-24f2g" podStartSLOduration=32.742622584 podStartE2EDuration="34.25971834s" podCreationTimestamp="2026-04-16 18:32:25 +0000 UTC" firstStartedPulling="2026-04-16 18:32:57.318178992 +0000 UTC m=+177.036036238" lastFinishedPulling="2026-04-16 18:32:58.83527475 +0000 UTC m=+178.553131994" observedRunningTime="2026-04-16 18:32:59.257895666 +0000 UTC m=+178.975752927" watchObservedRunningTime="2026-04-16 18:32:59.25971834 +0000 UTC m=+178.977575602"
Apr 16 18:32:59.442450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.442416 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"]
Apr 16 18:32:59.445967 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.445947 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:32:59.449462 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.449440 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 18:32:59.450359 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.450339 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ntkht\""
Apr 16 18:32:59.456049 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.455454 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"]
Apr 16 18:32:59.472734 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.472713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9qvvd\" (UID: \"80b150cc-effc-4cfb-a442-ad71bfb82365\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:32:59.573745 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:32:59.573718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9qvvd\" (UID: \"80b150cc-effc-4cfb-a442-ad71bfb82365\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:32:59.573879 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:59.573861 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 18:32:59.573967 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:32:59.573945 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates podName:80b150cc-effc-4cfb-a442-ad71bfb82365 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:00.073908258 +0000 UTC m=+179.791765502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-9qvvd" (UID: "80b150cc-effc-4cfb-a442-ad71bfb82365") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 18:33:00.077385 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:00.077346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9qvvd\" (UID: \"80b150cc-effc-4cfb-a442-ad71bfb82365\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:33:00.080124 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:00.080099 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/80b150cc-effc-4cfb-a442-ad71bfb82365-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9qvvd\" (UID: \"80b150cc-effc-4cfb-a442-ad71bfb82365\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:33:00.241289 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:00.241254 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kwg6q" event={"ID":"26032924-872b-462e-9763-af466c1929a8","Type":"ContainerStarted","Data":"b95eca7a90d681ac064683fdfa90d48d8eeb2fd2d05e36811d00876844315b4b"}
Apr 16 18:33:00.357958 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:00.357855 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:33:00.493886 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:00.493855 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"]
Apr 16 18:33:00.497677 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:00.497649 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b150cc_effc_4cfb_a442_ad71bfb82365.slice/crio-42d120b7dcbd39caecf0db0decd8d07b84cb3162febb4b0cb9105db677d506cc WatchSource:0}: Error finding container 42d120b7dcbd39caecf0db0decd8d07b84cb3162febb4b0cb9105db677d506cc: Status 404 returned error can't find the container with id 42d120b7dcbd39caecf0db0decd8d07b84cb3162febb4b0cb9105db677d506cc
Apr 16 18:33:01.245067 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:01.244968 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd" event={"ID":"80b150cc-effc-4cfb-a442-ad71bfb82365","Type":"ContainerStarted","Data":"42d120b7dcbd39caecf0db0decd8d07b84cb3162febb4b0cb9105db677d506cc"}
Apr 16 18:33:01.246592 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:01.246569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kwg6q" event={"ID":"26032924-872b-462e-9763-af466c1929a8","Type":"ContainerStarted","Data":"c821043ac939861f516715ec9a881023721635bf776aa55dded5bb20ec9b8c4b"}
Apr 16 18:33:01.271699 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:01.271645 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kwg6q" podStartSLOduration=1.2734254 podStartE2EDuration="3.271628253s" podCreationTimestamp="2026-04-16 18:32:58 +0000 UTC" firstStartedPulling="2026-04-16 18:32:58.96873789 +0000 UTC m=+178.686595132" lastFinishedPulling="2026-04-16 18:33:00.966940731 +0000 UTC m=+180.684797985" observedRunningTime="2026-04-16 18:33:01.270897723 +0000 UTC m=+180.988754986" watchObservedRunningTime="2026-04-16 18:33:01.271628253 +0000 UTC m=+180.989485517"
Apr 16 18:33:02.250080 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.250039 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd" event={"ID":"80b150cc-effc-4cfb-a442-ad71bfb82365","Type":"ContainerStarted","Data":"24a2a4fe27bfc925a6ac72c9806e271b854de57f9f3fab32d1e66048618de051"}
Apr 16 18:33:02.250544 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.250434 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:33:02.254849 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.254828 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd"
Apr 16 18:33:02.266354 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.266311 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9qvvd" podStartSLOduration=2.116559115 podStartE2EDuration="3.266300174s" podCreationTimestamp="2026-04-16 18:32:59 +0000 UTC" firstStartedPulling="2026-04-16 18:33:00.500035804 +0000 UTC m=+180.217893044" lastFinishedPulling="2026-04-16 18:33:01.649776859 +0000 UTC m=+181.367634103" observedRunningTime="2026-04-16 18:33:02.265832763 +0000 UTC m=+181.983690024" watchObservedRunningTime="2026-04-16 18:33:02.266300174 +0000 UTC m=+181.984157430"
Apr 16 18:33:02.504373 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.504294 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-s48sj"]
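The pod_startup_latency_tracker entries appear to relate their two durations in a simple way: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so pods whose images were already present report identical values. A back-of-envelope check against the admission-webhook entry above, using the monotonic m=+ offsets (the numbers below are copied from the log; the relationship itself is inferred, not quoted from kubelet documentation):

    package main

    import "fmt"

    func main() {
    	// Values from the prometheus-operator-admission-webhook-9cb97cd87-9qvvd entry above.
    	e2e := 3.266300174                    // podStartE2EDuration, seconds
    	pull := 181.367634103 - 180.217893044 // lastFinishedPulling - firstStartedPulling (m=+ offsets)
    	fmt.Printf("slo ~ %.9f\n", e2e-pull)  // ~ 2.116559115, matching podStartSLOduration
    }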
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.510298 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.510272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:33:02.510427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.510307 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:33:02.510483 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.510466 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-d7wng\"" Apr 16 18:33:02.511675 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.511661 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:33:02.522360 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.522339 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-s48sj"] Apr 16 18:33:02.593881 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.593844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.594058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.593940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnwq\" (UniqueName: \"kubernetes.io/projected/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-kube-api-access-gpnwq\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.594058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.593970 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-metrics-client-ca\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.594058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.593995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.695140 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.695102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnwq\" (UniqueName: \"kubernetes.io/projected/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-kube-api-access-gpnwq\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.695140 
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.695144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-metrics-client-ca\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.695373 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.695189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.695373 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.695248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.696063 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.696041 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-metrics-client-ca\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.697719 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.697694 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.697813 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.697796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.704778 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.704754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnwq\" (UniqueName: \"kubernetes.io/projected/34e75ef8-cf4c-4fcc-8522-9700e612bd1a-kube-api-access-gpnwq\") pod \"prometheus-operator-78f957474d-s48sj\" (UID: \"34e75ef8-cf4c-4fcc-8522-9700e612bd1a\") " pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.816226 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.816128 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" Apr 16 18:33:02.931430 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:02.931399 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-s48sj"] Apr 16 18:33:02.934588 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:02.934558 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e75ef8_cf4c_4fcc_8522_9700e612bd1a.slice/crio-6d205ad90cfbbc283871d9c914d6654502f3882e0bedccd2bfd4d998093c54f1 WatchSource:0}: Error finding container 6d205ad90cfbbc283871d9c914d6654502f3882e0bedccd2bfd4d998093c54f1: Status 404 returned error can't find the container with id 6d205ad90cfbbc283871d9c914d6654502f3882e0bedccd2bfd4d998093c54f1 Apr 16 18:33:03.254313 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:03.254272 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" event={"ID":"34e75ef8-cf4c-4fcc-8522-9700e612bd1a","Type":"ContainerStarted","Data":"6d205ad90cfbbc283871d9c914d6654502f3882e0bedccd2bfd4d998093c54f1"} Apr 16 18:33:04.259820 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:04.259749 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" event={"ID":"34e75ef8-cf4c-4fcc-8522-9700e612bd1a","Type":"ContainerStarted","Data":"3b37ed3ce96bb1e74adf2ed00ebe4614f4cfef63ee4d1a7994b15964e568aff9"} Apr 16 18:33:04.259820 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:04.259794 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" event={"ID":"34e75ef8-cf4c-4fcc-8522-9700e612bd1a","Type":"ContainerStarted","Data":"4d0a5a5c48a88b8217769960d5608fea7ae152fbc2f17b2602affc30ad71541a"} Apr 16 18:33:04.279863 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:04.279808 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-s48sj" podStartSLOduration=1.144416391 podStartE2EDuration="2.279788138s" podCreationTimestamp="2026-04-16 18:33:02 +0000 UTC" firstStartedPulling="2026-04-16 18:33:02.936437695 +0000 UTC m=+182.654294938" lastFinishedPulling="2026-04-16 18:33:04.071809445 +0000 UTC m=+183.789666685" observedRunningTime="2026-04-16 18:33:04.278316327 +0000 UTC m=+183.996173613" watchObservedRunningTime="2026-04-16 18:33:04.279788138 +0000 UTC m=+183.997645400" Apr 16 18:33:05.863937 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.863894 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"] Apr 16 18:33:05.867289 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.867267 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5sb9j"] Apr 16 18:33:05.867446 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.867431 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" Apr 16 18:33:05.870167 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.870146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:33:05.870293 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.870218 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:33:05.870458 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.870442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-qnhd8\"" Apr 16 18:33:05.870901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.870881 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.873288 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.873241 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rxsj5\"" Apr 16 18:33:05.873463 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.873429 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:33:05.873598 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.873578 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:33:05.873719 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.873616 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:33:05.878962 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.878945 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"] Apr 16 18:33:05.892403 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.892380 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"] Apr 16 18:33:05.895602 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.895584 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" Apr 16 18:33:05.898208 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.898190 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:33:05.898343 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.898322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-bblwd\"" Apr 16 18:33:05.898414 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.898335 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:33:05.899697 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.899683 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:33:05.917678 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.917654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"] Apr 16 18:33:05.921854 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.921828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.921969 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.921886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-wtmp\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.921969 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.921914 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.922047 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.921972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/953954e4-b723-467d-afcb-ae1b158de42e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" Apr 16 18:33:05.922047 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbr9t\" (UniqueName: \"kubernetes.io/projected/953954e4-b723-467d-afcb-ae1b158de42e-kube-api-access-kbr9t\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" Apr 16 18:33:05.922106 ip-10-0-129-166 
kubenswrapper[2570]: I0416 18:33:05.922045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-textfile\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.922106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922069 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-sys\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.922166 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-tls\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.922166 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" Apr 16 18:33:05.922237 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-root\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j" Apr 16 18:33:05.922237 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" Apr 16 18:33:05.922237 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" Apr 16 18:33:05.922344 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922240 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" 
Apr 16 18:33:05.922344 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:05.922344 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922274 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx69k\" (UniqueName: \"kubernetes.io/projected/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-kube-api-access-qx69k\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:05.922445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq4tj\" (UniqueName: \"kubernetes.io/projected/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-kube-api-access-jq4tj\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:05.922445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:05.922445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922401 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:05.922445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:05.922424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-metrics-client-ca\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.022824 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-textfile\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.022824 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-sys\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-tls\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-root\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.022955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-sys\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-root\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023086 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:33:06.023059 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023102 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:33:06.023133 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls podName:b8b92090-4fd9-4b07-973e-79e7cd43cfb5 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:06.523110067 +0000 UTC m=+186.240967314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-nf9vc" (UID: "b8b92090-4fd9-4b07-973e-79e7cd43cfb5") : secret "openshift-state-metrics-tls" not found
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx69k\" (UniqueName: \"kubernetes.io/projected/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-kube-api-access-qx69k\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4tj\" (UniqueName: \"kubernetes.io/projected/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-kube-api-access-jq4tj\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-metrics-client-ca\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023394 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023433 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-wtmp\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/953954e4-b723-467d-afcb-ae1b158de42e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbr9t\" (UniqueName: \"kubernetes.io/projected/953954e4-b723-467d-afcb-ae1b158de42e-kube-api-access-kbr9t\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023653 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023746 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-textfile\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023771 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.023911 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.023900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-wtmp\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.024279 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.024101 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/953954e4-b723-467d-afcb-ae1b158de42e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.024410 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.024384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-metrics-client-ca\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.024610 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.024585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.024996 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.024973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.025824 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.025802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-tls\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.026021 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.026002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.026356 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.026331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.026476 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.026458 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.026531 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.026515 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/953954e4-b723-467d-afcb-ae1b158de42e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.038335 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.038309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq4tj\" (UniqueName: \"kubernetes.io/projected/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-kube-api-access-jq4tj\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.039117 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.039097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbr9t\" (UniqueName: \"kubernetes.io/projected/953954e4-b723-467d-afcb-ae1b158de42e-kube-api-access-kbr9t\") pod \"kube-state-metrics-7479c89684-6lb8r\" (UID: \"953954e4-b723-467d-afcb-ae1b158de42e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.039486 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.039466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx69k\" (UniqueName: \"kubernetes.io/projected/eacdc74c-f6bb-4fce-a567-ab934f47b2c9-kube-api-access-qx69k\") pod \"node-exporter-5sb9j\" (UID: \"eacdc74c-f6bb-4fce-a567-ab934f47b2c9\") " pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.184555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.184458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5sb9j"
Apr 16 18:33:06.194163 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:06.194135 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeacdc74c_f6bb_4fce_a567_ab934f47b2c9.slice/crio-fb488d66b656087e4eb78286d6a129ac4586b2de71037e744ba664ee6af294ea WatchSource:0}: Error finding container fb488d66b656087e4eb78286d6a129ac4586b2de71037e744ba664ee6af294ea: Status 404 returned error can't find the container with id fb488d66b656087e4eb78286d6a129ac4586b2de71037e744ba664ee6af294ea
Apr 16 18:33:06.204396 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.204373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"
Apr 16 18:33:06.267272 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.267231 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sb9j" event={"ID":"eacdc74c-f6bb-4fce-a567-ab934f47b2c9","Type":"ContainerStarted","Data":"fb488d66b656087e4eb78286d6a129ac4586b2de71037e744ba664ee6af294ea"}
Apr 16 18:33:06.332151 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.332037 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-6lb8r"]
Apr 16 18:33:06.334572 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:06.334545 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod953954e4_b723_467d_afcb_ae1b158de42e.slice/crio-281d05fb985926d32a603c7bcd91298ee2b84de0b1a4c234e1a5c089d287cffd WatchSource:0}: Error finding container 281d05fb985926d32a603c7bcd91298ee2b84de0b1a4c234e1a5c089d287cffd: Status 404 returned error can't find the container with id 281d05fb985926d32a603c7bcd91298ee2b84de0b1a4c234e1a5c089d287cffd
Apr 16 18:33:06.527574 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.527495 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.529808 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.529776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b8b92090-4fd9-4b07-973e-79e7cd43cfb5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-nf9vc\" (UID: \"b8b92090-4fd9-4b07-973e-79e7cd43cfb5\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.777866 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.777792 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"
Apr 16 18:33:06.922499 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:06.922443 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc"]
Apr 16 18:33:06.927090 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:06.926638 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b92090_4fd9_4b07_973e_79e7cd43cfb5.slice/crio-e6fffca5f9c55503cab433aa1a9fa85afc70bf5ab127a5fe0ec6267320ff739b WatchSource:0}: Error finding container e6fffca5f9c55503cab433aa1a9fa85afc70bf5ab127a5fe0ec6267320ff739b: Status 404 returned error can't find the container with id e6fffca5f9c55503cab433aa1a9fa85afc70bf5ab127a5fe0ec6267320ff739b
Apr 16 18:33:07.271823 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:07.271791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" event={"ID":"b8b92090-4fd9-4b07-973e-79e7cd43cfb5","Type":"ContainerStarted","Data":"f2b1524abd365ac254955d1d297b4bd0faa10e6a02d6dce318e0b33a2c7ac2e1"}
Apr 16 18:33:07.272015 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:07.271830 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" event={"ID":"b8b92090-4fd9-4b07-973e-79e7cd43cfb5","Type":"ContainerStarted","Data":"36eda9f140b894b79fcaf126ff21527a1075c66607dda90f7457e577fbd5d37f"}
Apr 16 18:33:07.272015 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:07.271845 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" event={"ID":"b8b92090-4fd9-4b07-973e-79e7cd43cfb5","Type":"ContainerStarted","Data":"e6fffca5f9c55503cab433aa1a9fa85afc70bf5ab127a5fe0ec6267320ff739b"}
Apr 16 18:33:07.272865 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:07.272844 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" event={"ID":"953954e4-b723-467d-afcb-ae1b158de42e","Type":"ContainerStarted","Data":"281d05fb985926d32a603c7bcd91298ee2b84de0b1a4c234e1a5c089d287cffd"}
Apr 16 18:33:08.278912 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.278840 2570 generic.go:358] "Generic (PLEG): container finished" podID="eacdc74c-f6bb-4fce-a567-ab934f47b2c9" containerID="e3f69bc0f5635d3c7c4bcc592a764d04b6de819ba70348519dfdcc73c325c9b4" exitCode=0
Apr 16 18:33:08.279352 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.278950 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sb9j" event={"ID":"eacdc74c-f6bb-4fce-a567-ab934f47b2c9","Type":"ContainerDied","Data":"e3f69bc0f5635d3c7c4bcc592a764d04b6de819ba70348519dfdcc73c325c9b4"}
Apr 16 18:33:08.280944 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.280907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" event={"ID":"953954e4-b723-467d-afcb-ae1b158de42e","Type":"ContainerStarted","Data":"565c423eb5d3b6a4b53f62048051bf081e6ea9e005e6cbcac05ebe24443c8c98"}
Apr 16 18:33:08.281032 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.280950 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" event={"ID":"953954e4-b723-467d-afcb-ae1b158de42e","Type":"ContainerStarted","Data":"f8a546ae4f59a8cc20a6cfe86f25f3749d24a164ce5c327b16f39abc4fdbd777"}
Apr 16 18:33:08.281032 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.280960 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" event={"ID":"953954e4-b723-467d-afcb-ae1b158de42e","Type":"ContainerStarted","Data":"e2303361d9ef3079cc75ffca898e043c55b462cd1761be21838480a08a4257d1"}
Apr 16 18:33:08.489266 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.489227 2570 patch_prober.go:28] interesting pod/image-registry-8bf895f67-vrm6x container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 18:33:08.489411 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:08.489290 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:33:09.286256 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.286212 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sb9j" event={"ID":"eacdc74c-f6bb-4fce-a567-ab934f47b2c9","Type":"ContainerStarted","Data":"1df701d634efd1fae6a1b577441c791dea1914b1cbd43ee611de6a8cf072717f"}
Apr 16 18:33:09.286256 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.286259 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sb9j" event={"ID":"eacdc74c-f6bb-4fce-a567-ab934f47b2c9","Type":"ContainerStarted","Data":"84d37fa67b1cf8f9f20955814b92d02dc607453784499859446695c0d07d8a76"}
Apr 16 18:33:09.288107 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.288077 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" event={"ID":"b8b92090-4fd9-4b07-973e-79e7cd43cfb5","Type":"ContainerStarted","Data":"1e38c69f54593e8aedee6b4c26d894853394bfe25f0c6244c2d49263964ea895"}
Apr 16 18:33:09.322363 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.322311 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-6lb8r" podStartSLOduration=3.024700591 podStartE2EDuration="4.32229975s" podCreationTimestamp="2026-04-16 18:33:05 +0000 UTC" firstStartedPulling="2026-04-16 18:33:06.336399332 +0000 UTC m=+186.054256572" lastFinishedPulling="2026-04-16 18:33:07.633998477 +0000 UTC m=+187.351855731" observedRunningTime="2026-04-16 18:33:08.319561284 +0000 UTC m=+188.037418547" watchObservedRunningTime="2026-04-16 18:33:09.32229975 +0000 UTC m=+189.040157011"
Apr 16 18:33:09.323191 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.323159 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5sb9j" podStartSLOduration=2.887138167 podStartE2EDuration="4.323148673s" podCreationTimestamp="2026-04-16 18:33:05 +0000 UTC" firstStartedPulling="2026-04-16 18:33:06.195888943 +0000 UTC m=+185.913746183" lastFinishedPulling="2026-04-16 18:33:07.631899444 +0000 UTC m=+187.349756689" observedRunningTime="2026-04-16 18:33:09.322067398 +0000 UTC m=+189.039924660" watchObservedRunningTime="2026-04-16 18:33:09.323148673 +0000 UTC m=+189.041005935"
Apr 16 18:33:09.374840 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:09.374792 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-nf9vc" podStartSLOduration=3.176892743 podStartE2EDuration="4.374778747s" podCreationTimestamp="2026-04-16 18:33:05 +0000 UTC" firstStartedPulling="2026-04-16 18:33:07.118958953 +0000 UTC m=+186.836816199" lastFinishedPulling="2026-04-16 18:33:08.316844951 +0000 UTC m=+188.034702203" observedRunningTime="2026-04-16 18:33:09.373292222 +0000 UTC m=+189.091149485" watchObservedRunningTime="2026-04-16 18:33:09.374778747 +0000 UTC m=+189.092636011"
Apr 16 18:33:10.624598 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.624567 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"]
Apr 16 18:33:10.627716 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.627699 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:10.630512 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.630489 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 18:33:10.630615 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.630490 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-ff22r\""
Apr 16 18:33:10.636512 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.636490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"]
Apr 16 18:33:10.664712 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.664680 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-llkbq\" (UID: \"0640a4da-0593-4262-9936-a12646e54d28\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:10.765192 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:10.765147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-llkbq\" (UID: \"0640a4da-0593-4262-9936-a12646e54d28\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:10.765368 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:33:10.765294 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:33:10.765428 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:33:10.765391 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert podName:0640a4da-0593-4262-9936-a12646e54d28 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:11.265369265 +0000 UTC m=+190.983226510 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-llkbq" (UID: "0640a4da-0593-4262-9936-a12646e54d28") : secret "monitoring-plugin-cert" not found
Apr 16 18:33:11.158404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.158370 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-794d85457c-r46ps"]
Apr 16 18:33:11.161623 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.161603 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.164475 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164449 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 18:33:11.164617 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164502 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 18:33:11.164617 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164536 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 18:33:11.164617 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164551 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 18:33:11.164768 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164634 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 18:33:11.164895 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.164879 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g5gbx\""
Apr 16 18:33:11.170762 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.170741 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 18:33:11.180710 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.180688 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-794d85457c-r46ps"]
Apr 16 18:33:11.269141 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2p9f\" (UniqueName: \"kubernetes.io/projected/49fd2964-d995-433c-a395-164796728457-kube-api-access-m2p9f\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-serving-certs-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-federate-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269304 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269287 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-metrics-client-ca\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269533 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-llkbq\" (UID: \"0640a4da-0593-4262-9936-a12646e54d28\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:11.269533 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-telemeter-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.269533 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.269384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.271747 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.271723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0640a4da-0593-4262-9936-a12646e54d28-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-llkbq\" (UID: \"0640a4da-0593-4262-9936-a12646e54d28\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:11.369910 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.369874 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2p9f\" (UniqueName: \"kubernetes.io/projected/49fd2964-d995-433c-a395-164796728457-kube-api-access-m2p9f\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370127 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.369953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-serving-certs-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370127 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.369987 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370127 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.370022 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370278 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.370132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-federate-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370278 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.370166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-metrics-client-ca\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370278 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.370228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-telemeter-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.370425 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.370310 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.371110 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.371083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-serving-certs-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.371277 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.371252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-telemeter-trusted-ca-bundle\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.371451 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.371427 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49fd2964-d995-433c-a395-164796728457-metrics-client-ca\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.373382 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.373352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-federate-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.376445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.373776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.376445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.374139 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-telemeter-client-tls\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.376445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.375345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/49fd2964-d995-433c-a395-164796728457-secret-telemeter-client\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.379570 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.379544 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2p9f\" (UniqueName: \"kubernetes.io/projected/49fd2964-d995-433c-a395-164796728457-kube-api-access-m2p9f\") pod \"telemeter-client-794d85457c-r46ps\" (UID: \"49fd2964-d995-433c-a395-164796728457\") " pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.472854 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.472749 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps"
Apr 16 18:33:11.537506 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.537475 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"
Apr 16 18:33:11.614160 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.613905 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-794d85457c-r46ps"]
Apr 16 18:33:11.616206 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:11.615337 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fd2964_d995_433c_a395_164796728457.slice/crio-63d7da80bb97de9a75e287b33ae8e77ac1c1a9a37666b2530c38bbfd305effa0 WatchSource:0}: Error finding container 63d7da80bb97de9a75e287b33ae8e77ac1c1a9a37666b2530c38bbfd305effa0: Status 404 returned error can't find the container with id 63d7da80bb97de9a75e287b33ae8e77ac1c1a9a37666b2530c38bbfd305effa0
Apr 16 18:33:11.671282 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:11.671248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq"]
Apr 16 18:33:11.674141 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:11.674111 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0640a4da_0593_4262_9936_a12646e54d28.slice/crio-a69c7457b25e622e05ab5053f136bfed9ea11e70fe9b147b0aa88758e7d3fe67 WatchSource:0}: Error finding container a69c7457b25e622e05ab5053f136bfed9ea11e70fe9b147b0aa88758e7d3fe67: Status 404 returned error can't find the container with id a69c7457b25e622e05ab5053f136bfed9ea11e70fe9b147b0aa88758e7d3fe67
Apr 16 18:33:12.166174 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.164621 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:33:12.170821 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.170790 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.174031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.173991 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 18:33:12.174155 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.174074 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 18:33:12.174344 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.174328 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-245f5\""
Apr 16 18:33:12.174434 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.174371 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 18:33:12.174628 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.174605 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.176731 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.177020 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6lb3ca33lsk0t\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.177839 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.178137 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.178335 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 18:33:12.178698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.178537 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 18:33:12.180785 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.179825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 18:33:12.180785 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.180285 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 18:33:12.180785 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.180477 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 18:33:12.180785 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.180666 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 18:33:12.189845 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.188648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278362 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mkk\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278470 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278481 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278592 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.278891 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.279315 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.279315 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.278971 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.279315 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.279015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.298803 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.298490 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq" event={"ID":"0640a4da-0593-4262-9936-a12646e54d28","Type":"ContainerStarted","Data":"a69c7457b25e622e05ab5053f136bfed9ea11e70fe9b147b0aa88758e7d3fe67"}
Apr 16 18:33:12.300284 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.300254 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps" event={"ID":"49fd2964-d995-433c-a395-164796728457","Type":"ContainerStarted","Data":"63d7da80bb97de9a75e287b33ae8e77ac1c1a9a37666b2530c38bbfd305effa0"}
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380141 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380231 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mkk\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380465 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380569 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.381213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380637 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.382331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.382331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.380718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.382331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.381168 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.382331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.382256 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.383457 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.382630 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.385673 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.385647 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.386250 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.386109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.386250 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.386209 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.389312 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.389292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.392676 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393257 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393479 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.393595 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.393561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.394824 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.394741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.398064 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.398028 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mkk\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk\") pod \"prometheus-k8s-0\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.486142 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.486061 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:12.648327 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:12.648284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:33:13.001333 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:33:13.001297 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebb74076_3b69_4b37_a937_1210ede643cb.slice/crio-17147f4763ae56490d1f5b63031faeb57781d370ed8c9bca217311992a75ef82 WatchSource:0}: Error finding container 17147f4763ae56490d1f5b63031faeb57781d370ed8c9bca217311992a75ef82: Status 404 returned error can't find the container with id 17147f4763ae56490d1f5b63031faeb57781d370ed8c9bca217311992a75ef82 Apr 16 18:33:13.305152 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:13.305060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq" event={"ID":"0640a4da-0593-4262-9936-a12646e54d28","Type":"ContainerStarted","Data":"98ee98ed3b131548523fafdddaed16a36635b411ac082762e885cdde16e49082"} Apr 16 18:33:13.305152 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:13.305106 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq" Apr 16 18:33:13.306419 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:13.306382 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"17147f4763ae56490d1f5b63031faeb57781d370ed8c9bca217311992a75ef82"} Apr 16 18:33:13.310586 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:13.310534 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq" Apr 16 18:33:13.324002 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:13.323951 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-llkbq" podStartSLOduration=1.950701311 podStartE2EDuration="3.323913954s" podCreationTimestamp="2026-04-16 18:33:10 +0000 UTC" firstStartedPulling="2026-04-16 18:33:11.675841266 +0000 UTC m=+191.393698507" lastFinishedPulling="2026-04-16 18:33:13.049053909 +0000 UTC m=+192.766911150" observedRunningTime="2026-04-16 18:33:13.322560463 +0000 UTC m=+193.040417726" watchObservedRunningTime="2026-04-16 18:33:13.323913954 +0000 UTC m=+193.041771220" Apr 16 18:33:14.311124 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:14.311098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35"} Apr 16 18:33:14.313145 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:14.313122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps" event={"ID":"49fd2964-d995-433c-a395-164796728457","Type":"ContainerStarted","Data":"faa5142972bb2df27bb7b1b13081cbd926867dff2157a7bb5ca4257bbebdbe3f"} Apr 16 18:33:14.313238 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:14.313153 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps" 
event={"ID":"49fd2964-d995-433c-a395-164796728457","Type":"ContainerStarted","Data":"062ca7f769d758120b0f2b318f8af400a1a34c5009864326472c1e97b73e2e60"} Apr 16 18:33:15.317306 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:15.317271 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35" exitCode=0 Apr 16 18:33:15.317707 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:15.317361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35"} Apr 16 18:33:15.319443 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:15.319413 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps" event={"ID":"49fd2964-d995-433c-a395-164796728457","Type":"ContainerStarted","Data":"e2a0433ed125470bafdf3cbcfa09f0008a6cc515d9e66b12b2277243014aa121"} Apr 16 18:33:15.372242 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:15.372192 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-794d85457c-r46ps" podStartSLOduration=1.7629704720000001 podStartE2EDuration="4.372176514s" podCreationTimestamp="2026-04-16 18:33:11 +0000 UTC" firstStartedPulling="2026-04-16 18:33:11.617945538 +0000 UTC m=+191.335802784" lastFinishedPulling="2026-04-16 18:33:14.227151587 +0000 UTC m=+193.945008826" observedRunningTime="2026-04-16 18:33:15.370857327 +0000 UTC m=+195.088714598" watchObservedRunningTime="2026-04-16 18:33:15.372176514 +0000 UTC m=+195.090033775" Apr 16 18:33:18.332481 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:18.332449 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15"} Apr 16 18:33:18.332481 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:18.332484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074"} Apr 16 18:33:18.488872 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:18.488841 2570 patch_prober.go:28] interesting pod/image-registry-8bf895f67-vrm6x container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:33:18.489024 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:18.488892 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:20.340606 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:20.340571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc"} Apr 16 
18:33:20.341015 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:20.340613 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1"} Apr 16 18:33:20.341015 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:20.340627 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9"} Apr 16 18:33:20.341015 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:20.340640 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerStarted","Data":"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801"} Apr 16 18:33:20.370725 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:20.370677 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.503739012 podStartE2EDuration="8.370662412s" podCreationTimestamp="2026-04-16 18:33:12 +0000 UTC" firstStartedPulling="2026-04-16 18:33:13.003562924 +0000 UTC m=+192.721420164" lastFinishedPulling="2026-04-16 18:33:19.870486324 +0000 UTC m=+199.588343564" observedRunningTime="2026-04-16 18:33:20.368836662 +0000 UTC m=+200.086693924" watchObservedRunningTime="2026-04-16 18:33:20.370662412 +0000 UTC m=+200.088519673" Apr 16 18:33:22.487111 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:22.487075 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:23.504383 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.504319 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" containerID="cri-o://f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be" gracePeriod=30 Apr 16 18:33:23.737514 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.737492 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" Apr 16 18:33:23.794522 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794448 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794522 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794483 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794522 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794507 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794543 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794597 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794629 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x5m2\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794657 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.794788 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.794726 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted\") pod \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\" (UID: \"77d682f3-5ff1-4f82-b33e-e8723e48e5f9\") " Apr 16 18:33:23.795155 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.795127 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:23.795322 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.795295 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:33:23.797069 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.797033 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2" (OuterVolumeSpecName: "kube-api-access-2x5m2") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "kube-api-access-2x5m2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:23.797197 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.797044 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:23.797197 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.797135 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:23.797197 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.797178 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:33:23.797548 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.797523 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:33:23.803744 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.803721 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "77d682f3-5ff1-4f82-b33e-e8723e48e5f9" (UID: "77d682f3-5ff1-4f82-b33e-e8723e48e5f9"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:23.895441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895406 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-certificates\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895436 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x5m2\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-kube-api-access-2x5m2\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895441 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895449 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-trusted-ca\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895659 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895462 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-ca-trust-extracted\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895659 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895474 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-bound-sa-token\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895659 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895486 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-image-registry-private-configuration\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895659 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895497 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-registry-tls\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:23.895659 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:23.895510 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77d682f3-5ff1-4f82-b33e-e8723e48e5f9-installation-pull-secrets\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:33:24.353266 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.353237 2570 generic.go:358] "Generic (PLEG): container finished" podID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerID="f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be" exitCode=0 Apr 16 18:33:24.353439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.353298 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" Apr 16 18:33:24.353439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.353314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" event={"ID":"77d682f3-5ff1-4f82-b33e-e8723e48e5f9","Type":"ContainerDied","Data":"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be"} Apr 16 18:33:24.353439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.353341 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bf895f67-vrm6x" event={"ID":"77d682f3-5ff1-4f82-b33e-e8723e48e5f9","Type":"ContainerDied","Data":"fc45b1a4953d8e046aa3acc064fcdd8f2b569bd25feb323c9a83897841914bcd"} Apr 16 18:33:24.353439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.353356 2570 scope.go:117] "RemoveContainer" containerID="f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be" Apr 16 18:33:24.361588 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.361572 2570 scope.go:117] "RemoveContainer" containerID="f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be" Apr 16 18:33:24.361835 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:33:24.361814 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be\": container with ID starting with f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be not found: ID does not exist" containerID="f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be" Apr 16 18:33:24.361905 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.361843 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be"} err="failed to get container status \"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be\": rpc error: code = NotFound desc = could not find container \"f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be\": container with ID starting with f0977c9b484d1a318a93497f8bb19c092948b865a2ada0066c87d2b7076362be not found: ID does not exist" Apr 16 18:33:24.375423 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.375403 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"] Apr 16 18:33:24.381555 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.381535 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8bf895f67-vrm6x"] Apr 16 18:33:24.774773 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:33:24.774696 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" path="/var/lib/kubelet/pods/77d682f3-5ff1-4f82-b33e-e8723e48e5f9/volumes" Apr 16 18:34:05.750173 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:05.750141 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cljk9_3cdc1460-1781-45b3-ad12-0173537882af/dns-node-resolver/0.log" Apr 16 18:34:12.486596 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.486524 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:12.504986 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.504961 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:12.510672 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.510649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:34:12.512749 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.512723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d7d2281-07bb-4906-844c-f53fbfe57143-metrics-certs\") pod \"network-metrics-daemon-kc2vf\" (UID: \"1d7d2281-07bb-4906-844c-f53fbfe57143\") " pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:34:12.574337 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.574311 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wddp7\"" Apr 16 18:34:12.581815 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.581795 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kc2vf" Apr 16 18:34:12.696779 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:12.696750 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kc2vf"] Apr 16 18:34:12.699622 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:34:12.699593 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d7d2281_07bb_4906_844c_f53fbfe57143.slice/crio-77a4aa9e509d98f7d9453f7b170e01cab027f7ce07ca2a5b2cd86f93f2e9f0d5 WatchSource:0}: Error finding container 77a4aa9e509d98f7d9453f7b170e01cab027f7ce07ca2a5b2cd86f93f2e9f0d5: Status 404 returned error can't find the container with id 77a4aa9e509d98f7d9453f7b170e01cab027f7ce07ca2a5b2cd86f93f2e9f0d5 Apr 16 18:34:13.485848 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:13.485817 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kc2vf" event={"ID":"1d7d2281-07bb-4906-844c-f53fbfe57143","Type":"ContainerStarted","Data":"77a4aa9e509d98f7d9453f7b170e01cab027f7ce07ca2a5b2cd86f93f2e9f0d5"} Apr 16 18:34:13.502845 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:13.502820 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:14.491023 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:14.490979 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kc2vf" event={"ID":"1d7d2281-07bb-4906-844c-f53fbfe57143","Type":"ContainerStarted","Data":"c86969ce4e843fd95e5de6fb2b12cb207cbfb057d7de4605d8a2f4d8879600d6"} Apr 16 18:34:14.491023 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:14.491028 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kc2vf" event={"ID":"1d7d2281-07bb-4906-844c-f53fbfe57143","Type":"ContainerStarted","Data":"99165ff266cd8623a176fa616581d067977bb429db5d5d29087b8bf5b4c654f5"} Apr 16 18:34:14.511280 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:14.511228 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kc2vf" podStartSLOduration=253.588304338 podStartE2EDuration="4m14.511211476s" 
podCreationTimestamp="2026-04-16 18:30:00 +0000 UTC" firstStartedPulling="2026-04-16 18:34:12.701323307 +0000 UTC m=+252.419180547" lastFinishedPulling="2026-04-16 18:34:13.624230427 +0000 UTC m=+253.342087685" observedRunningTime="2026-04-16 18:34:14.510165708 +0000 UTC m=+254.228022971" watchObservedRunningTime="2026-04-16 18:34:14.511211476 +0000 UTC m=+254.229068738" Apr 16 18:34:30.597547 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.597502 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:30.598087 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.597993 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-web" containerID="cri-o://cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9" gracePeriod=600 Apr 16 18:34:30.598087 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.598035 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy" containerID="cri-o://d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1" gracePeriod=600 Apr 16 18:34:30.598087 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.598036 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-thanos" containerID="cri-o://231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc" gracePeriod=600 Apr 16 18:34:30.598087 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.598031 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="config-reloader" containerID="cri-o://dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15" gracePeriod=600 Apr 16 18:34:30.598314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.597988 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="thanos-sidecar" containerID="cri-o://f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801" gracePeriod=600 Apr 16 18:34:30.598314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:30.597952 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="prometheus" containerID="cri-o://2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074" gracePeriod=600 Apr 16 18:34:31.543646 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543612 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc" exitCode=0 Apr 16 18:34:31.543646 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543637 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1" exitCode=0 Apr 16 18:34:31.543646 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543644 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801" exitCode=0 Apr 16 18:34:31.543646 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543650 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15" exitCode=0 Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543656 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074" exitCode=0 Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc"} Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543711 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1"} Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543723 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801"} Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543735 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15"} Apr 16 18:34:31.543915 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.543746 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074"} Apr 16 18:34:31.837473 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.837447 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:31.985051 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985014 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985248 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985068 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985248 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985091 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985248 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985112 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985248 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985131 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4mkk\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985270 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985337 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985363 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985395 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle\") pod 
\"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985422 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985463 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985491 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985523 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985575 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985598 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985630 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985670 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: \"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.985757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985696 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets\") pod \"ebb74076-3b69-4b37-a937-1210ede643cb\" (UID: 
\"ebb74076-3b69-4b37-a937-1210ede643cb\") " Apr 16 18:34:31.986126 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.985895 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:31.986126 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.986060 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:31.986712 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.986457 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:31.986712 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.986561 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:31.988239 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988205 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk" (OuterVolumeSpecName: "kube-api-access-z4mkk") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "kube-api-access-z4mkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:31.988239 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988224 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:31.988473 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988448 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.988549 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988508 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:31.989058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988794 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.989058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.988942 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:31.989222 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.989199 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.989586 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.989524 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.990230 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.990192 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:31.990608 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.990576 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config" (OuterVolumeSpecName: "config") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.990694 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.990647 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out" (OuterVolumeSpecName: "config-out") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:31.990792 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.990772 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.991058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.991042 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:31.991411 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:31.991396 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:32.000153 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.000126 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config" (OuterVolumeSpecName: "web-config") pod "ebb74076-3b69-4b37-a937-1210ede643cb" (UID: "ebb74076-3b69-4b37-a937-1210ede643cb"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:32.087348 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087309 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087348 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087344 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-grpc-tls\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087348 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087356 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-tls-assets\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087365 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087374 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087382 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-metrics-client-certs\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087391 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-kube-rbac-proxy\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087399 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4mkk\" (UniqueName: \"kubernetes.io/projected/ebb74076-3b69-4b37-a937-1210ede643cb-kube-api-access-z4mkk\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087408 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-web-config\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087415 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-metrics-client-ca\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087424 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087434 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087443 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebb74076-3b69-4b37-a937-1210ede643cb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087451 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-prometheus-k8s-db\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087459 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ebb74076-3b69-4b37-a937-1210ede643cb-config-out\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087468 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-config\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.087566 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.087476 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ebb74076-3b69-4b37-a937-1210ede643cb-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:34:32.549032 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.548942 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebb74076-3b69-4b37-a937-1210ede643cb" containerID="cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9" exitCode=0 Apr 16 18:34:32.549032 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.549013 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9"} Apr 16 18:34:32.549254 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.549059 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ebb74076-3b69-4b37-a937-1210ede643cb","Type":"ContainerDied","Data":"17147f4763ae56490d1f5b63031faeb57781d370ed8c9bca217311992a75ef82"} Apr 16 18:34:32.549254 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.549078 2570 scope.go:117] "RemoveContainer" containerID="231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc" Apr 16 18:34:32.549254 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.549078 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.556625 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.556490 2570 scope.go:117] "RemoveContainer" containerID="d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1" Apr 16 18:34:32.563283 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.563267 2570 scope.go:117] "RemoveContainer" containerID="cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9" Apr 16 18:34:32.569665 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.569645 2570 scope.go:117] "RemoveContainer" containerID="f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801" Apr 16 18:34:32.576360 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.576335 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:32.576675 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.576661 2570 scope.go:117] "RemoveContainer" containerID="dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15" Apr 16 18:34:32.583759 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.583733 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:32.584152 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.584132 2570 scope.go:117] "RemoveContainer" containerID="2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074" Apr 16 18:34:32.590724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.590707 2570 scope.go:117] "RemoveContainer" containerID="39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35" Apr 16 18:34:32.596713 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.596686 2570 scope.go:117] "RemoveContainer" containerID="231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc" Apr 16 18:34:32.596948 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.596931 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc\": container with ID starting with 231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc not found: ID does not exist" containerID="231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc" Apr 16 18:34:32.596996 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.596956 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc"} err="failed to get container status \"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc\": rpc error: code = NotFound desc = could not find container \"231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc\": container with ID starting with 231b4172386e933c2c53fd4cc8685b9546b894bed88471b590fb3b96a9fa49cc not found: ID does not exist" Apr 16 18:34:32.596996 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.596973 2570 scope.go:117] "RemoveContainer" containerID="d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1" Apr 16 18:34:32.597177 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.597160 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1\": container with ID starting with d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1 not found: ID does not exist" 
containerID="d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1" Apr 16 18:34:32.597262 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597179 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1"} err="failed to get container status \"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1\": rpc error: code = NotFound desc = could not find container \"d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1\": container with ID starting with d2c8c3c3f8017a74408b5bb1a6c7ad3681e4eda7488aa6c54c49a7a34f0757c1 not found: ID does not exist" Apr 16 18:34:32.597262 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597192 2570 scope.go:117] "RemoveContainer" containerID="cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9" Apr 16 18:34:32.597408 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.597393 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9\": container with ID starting with cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9 not found: ID does not exist" containerID="cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9" Apr 16 18:34:32.597453 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597411 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9"} err="failed to get container status \"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9\": rpc error: code = NotFound desc = could not find container \"cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9\": container with ID starting with cd65dd92f42e10208b857bbc650d604dc7d310e2dc4192a5184ebf9bb3fee0f9 not found: ID does not exist" Apr 16 18:34:32.597453 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597423 2570 scope.go:117] "RemoveContainer" containerID="f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801" Apr 16 18:34:32.597653 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.597635 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801\": container with ID starting with f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801 not found: ID does not exist" containerID="f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801" Apr 16 18:34:32.597694 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597659 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801"} err="failed to get container status \"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801\": rpc error: code = NotFound desc = could not find container \"f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801\": container with ID starting with f94262f5948f333c96c8e7d365b1e0d95df4cdc48c372e2272005d92ad757801 not found: ID does not exist" Apr 16 18:34:32.597694 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597675 2570 scope.go:117] "RemoveContainer" containerID="dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15" Apr 16 18:34:32.597893 ip-10-0-129-166 kubenswrapper[2570]: E0416 
18:34:32.597879 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15\": container with ID starting with dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15 not found: ID does not exist" containerID="dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15" Apr 16 18:34:32.597957 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597896 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15"} err="failed to get container status \"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15\": rpc error: code = NotFound desc = could not find container \"dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15\": container with ID starting with dd2b1962999cf9d49b4e934f9e0ba2679b279a81ee78f7eafc03ef0b82573b15 not found: ID does not exist" Apr 16 18:34:32.597957 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.597908 2570 scope.go:117] "RemoveContainer" containerID="2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074" Apr 16 18:34:32.598166 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.598151 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074\": container with ID starting with 2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074 not found: ID does not exist" containerID="2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074" Apr 16 18:34:32.598208 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.598168 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074"} err="failed to get container status \"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074\": rpc error: code = NotFound desc = could not find container \"2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074\": container with ID starting with 2e442cfa946c283a7b813734f4d3399fcf190e904d66b3db75eb1a540d130074 not found: ID does not exist" Apr 16 18:34:32.598208 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.598181 2570 scope.go:117] "RemoveContainer" containerID="39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35" Apr 16 18:34:32.598377 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:32.598363 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35\": container with ID starting with 39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35 not found: ID does not exist" containerID="39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35" Apr 16 18:34:32.598416 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.598381 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35"} err="failed to get container status \"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35\": rpc error: code = NotFound desc = could not find container \"39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35\": container with ID starting with 
39472452e23d40a05b88c7b9349fa086de6c8ce9c19556d65920ac7591593f35 not found: ID does not exist" Apr 16 18:34:32.618498 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618471 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:32.618790 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618778 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="prometheus" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618792 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="prometheus" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618800 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-web" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618807 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-web" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618816 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="init-config-reloader" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618822 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="init-config-reloader" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618829 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-thanos" Apr 16 18:34:32.618834 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618835 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-thanos" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618842 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="thanos-sidecar" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618848 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="thanos-sidecar" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618857 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618862 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618869 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="config-reloader" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618874 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="config-reloader" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618884 2570 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618889 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618955 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="thanos-sidecar" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618964 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618970 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="77d682f3-5ff1-4f82-b33e-e8723e48e5f9" containerName="registry" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618978 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="prometheus" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618984 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-thanos" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618991 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="config-reloader" Apr 16 18:34:32.619066 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.618996 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" containerName="kube-rbac-proxy-web" Apr 16 18:34:32.622583 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.622568 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.626024 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.625757 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:34:32.626024 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.625770 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:34:32.626194 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.626057 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:34:32.626194 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.626092 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:34:32.626297 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.626192 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:34:32.626877 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.626861 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:34:32.627124 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627103 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:34:32.627226 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627143 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6lb3ca33lsk0t\"" Apr 16 18:34:32.627226 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:34:32.627414 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627397 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:34:32.627527 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627512 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:34:32.627581 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.627541 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:34:32.628613 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.628587 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-245f5\"" Apr 16 18:34:32.630890 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.630869 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:34:32.632174 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.632156 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:34:32.642722 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.642701 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:32.691778 
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.691778 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691779 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.691973 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.691973 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691847 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.691973 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.691973 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692092 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.691988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692092 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692092 ip-10-0-129-166 
kubenswrapper[2570]: I0416 18:34:32.692037 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692092 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692092 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692233 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692194 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5bc\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-kube-api-access-fq5bc\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692379 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692232 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.692379 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.692257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.775158 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.775123 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb74076-3b69-4b37-a937-1210ede643cb" path="/var/lib/kubelet/pods/ebb74076-3b69-4b37-a937-1210ede643cb/volumes" Apr 16 18:34:32.792576 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792681 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792681 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792638 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792681 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792830 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792830 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792712 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.792830 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792743 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793000 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793000 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793000 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793000 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.792989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793196 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793196 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793196 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793097 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793196 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config\") pod 
\"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793196 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793428 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5bc\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-kube-api-access-fq5bc\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793428 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793562 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.793620 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.793539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.795817 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.795735 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796168 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-web-config\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796173 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797028 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.796715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.797307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797439 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.797318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ad24bfd7-e54b-42e4-8994-427c01a4fba9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.797767 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.797723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.798021 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.797997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config-out\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.798796 ip-10-0-129-166 kubenswrapper[2570]: 
I0416 18:34:32.798769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.798907 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.798890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.798980 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.798960 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ad24bfd7-e54b-42e4-8994-427c01a4fba9-config\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.805523 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.805464 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5bc\" (UniqueName: \"kubernetes.io/projected/ad24bfd7-e54b-42e4-8994-427c01a4fba9-kube-api-access-fq5bc\") pod \"prometheus-k8s-0\" (UID: \"ad24bfd7-e54b-42e4-8994-427c01a4fba9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:32.932968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:32.932907 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:33.075093 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:33.075059 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:33.076378 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:34:33.076351 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad24bfd7_e54b_42e4_8994_427c01a4fba9.slice/crio-879c99ddf718dcc808fe1354a75f0918ed2c5235562d2ad868e6307b2eafaa10 WatchSource:0}: Error finding container 879c99ddf718dcc808fe1354a75f0918ed2c5235562d2ad868e6307b2eafaa10: Status 404 returned error can't find the container with id 879c99ddf718dcc808fe1354a75f0918ed2c5235562d2ad868e6307b2eafaa10 Apr 16 18:34:33.554786 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:33.554705 2570 generic.go:358] "Generic (PLEG): container finished" podID="ad24bfd7-e54b-42e4-8994-427c01a4fba9" containerID="ac914fa5ca7bf06bb5b8d0d47c5d39dff1712e805109ec138aeba1b6c99e5c10" exitCode=0 Apr 16 18:34:33.554786 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:33.554755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerDied","Data":"ac914fa5ca7bf06bb5b8d0d47c5d39dff1712e805109ec138aeba1b6c99e5c10"} Apr 16 18:34:33.554786 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:33.554775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"879c99ddf718dcc808fe1354a75f0918ed2c5235562d2ad868e6307b2eafaa10"} Apr 16 18:34:34.560697 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560663 2570 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"eb24210f8e69688916223f02d717639023a7d0ec798a7166319419ed804153bf"} Apr 16 18:34:34.560697 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560700 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"90368b1cdf368831cbd76067504e564c58dd1e0439d7054a02f53cde75c25fa5"} Apr 16 18:34:34.561106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560711 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"2d21c02ecf3b8c57a92e52cd2f5b9ad064f628419ef17f2f2fd2716dff519def"} Apr 16 18:34:34.561106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"239bdd932a11ae43cbd136bf84ac240328414d3f623365c2d892b6e9333f3b67"} Apr 16 18:34:34.561106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"c48945913296e96ad2f6ccc43315298e68a14e8a4fe42dd3cac8474a6f89649c"} Apr 16 18:34:34.561106 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.560735 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ad24bfd7-e54b-42e4-8994-427c01a4fba9","Type":"ContainerStarted","Data":"6ddc4910fff518c17e3cbdd101cb06c2a60619a8dd4556c6b74d517acbd24d05"} Apr 16 18:34:34.595456 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:34.595398 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.595385603 podStartE2EDuration="2.595385603s" podCreationTimestamp="2026-04-16 18:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:34.593714053 +0000 UTC m=+274.311571314" watchObservedRunningTime="2026-04-16 18:34:34.595385603 +0000 UTC m=+274.313242865" Apr 16 18:34:37.933900 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:37.933863 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:40.175013 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:40.174959 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lp728" podUID="8529cef7-b4bb-4d9b-9a9d-cd0b821f2437" Apr 16 18:34:40.175013 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:34:40.174958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s87rb" podUID="a25871f6-0ad2-44ac-9f9c-492a30345e0e" Apr 16 18:34:40.580535 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:40.580508 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lp728" Apr 16 18:34:40.580696 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:40.580508 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:34:43.585599 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.585557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:34:43.586047 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.585628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:34:43.588081 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.588049 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8529cef7-b4bb-4d9b-9a9d-cd0b821f2437-metrics-tls\") pod \"dns-default-lp728\" (UID: \"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437\") " pod="openshift-dns/dns-default-lp728" Apr 16 18:34:43.588454 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.588432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a25871f6-0ad2-44ac-9f9c-492a30345e0e-cert\") pod \"ingress-canary-s87rb\" (UID: \"a25871f6-0ad2-44ac-9f9c-492a30345e0e\") " pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:34:43.884683 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.884590 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2pvvf\"" Apr 16 18:34:43.885420 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.885401 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-h4pkb\"" Apr 16 18:34:43.892672 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.892647 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s87rb" Apr 16 18:34:43.892672 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:43.892667 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lp728" Apr 16 18:34:44.021761 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:44.021729 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s87rb"] Apr 16 18:34:44.025468 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:34:44.025439 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda25871f6_0ad2_44ac_9f9c_492a30345e0e.slice/crio-cac2f603391bdc562eab3b7843be168b391b8b3f246c0a3d3c5527d4abc0a253 WatchSource:0}: Error finding container cac2f603391bdc562eab3b7843be168b391b8b3f246c0a3d3c5527d4abc0a253: Status 404 returned error can't find the container with id cac2f603391bdc562eab3b7843be168b391b8b3f246c0a3d3c5527d4abc0a253 Apr 16 18:34:44.042132 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:44.042112 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lp728"] Apr 16 18:34:44.044001 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:34:44.043974 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8529cef7_b4bb_4d9b_9a9d_cd0b821f2437.slice/crio-22012a37312a2c0c61030490029f714a60478fcf859e71145a55f09adcfc0e9e WatchSource:0}: Error finding container 22012a37312a2c0c61030490029f714a60478fcf859e71145a55f09adcfc0e9e: Status 404 returned error can't find the container with id 22012a37312a2c0c61030490029f714a60478fcf859e71145a55f09adcfc0e9e Apr 16 18:34:44.594743 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:44.594703 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s87rb" event={"ID":"a25871f6-0ad2-44ac-9f9c-492a30345e0e","Type":"ContainerStarted","Data":"cac2f603391bdc562eab3b7843be168b391b8b3f246c0a3d3c5527d4abc0a253"} Apr 16 18:34:44.596270 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:44.596212 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp728" event={"ID":"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437","Type":"ContainerStarted","Data":"22012a37312a2c0c61030490029f714a60478fcf859e71145a55f09adcfc0e9e"} Apr 16 18:34:46.603404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.603364 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp728" event={"ID":"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437","Type":"ContainerStarted","Data":"a02e62fdb780a5a1d8a72e0a3eff8eda34860ab5c415fc483876a512a4fb63fa"} Apr 16 18:34:46.603404 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.603411 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lp728" event={"ID":"8529cef7-b4bb-4d9b-9a9d-cd0b821f2437","Type":"ContainerStarted","Data":"58d1ba0d58f87e0eafab414e0f5613c8450b9aa634444e60dbc54db01ef69e26"} Apr 16 18:34:46.603901 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.603490 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lp728" Apr 16 18:34:46.604689 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.604668 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s87rb" event={"ID":"a25871f6-0ad2-44ac-9f9c-492a30345e0e","Type":"ContainerStarted","Data":"f15b5829543214368a0e3ce65c86f90ee4cbe11d189a6ff42a5189b6a9851068"} Apr 16 18:34:46.622770 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.622727 2570 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-dns/dns-default-lp728" podStartSLOduration=251.859721677 podStartE2EDuration="4m13.622713703s" podCreationTimestamp="2026-04-16 18:30:33 +0000 UTC" firstStartedPulling="2026-04-16 18:34:44.045716981 +0000 UTC m=+283.763574221" lastFinishedPulling="2026-04-16 18:34:45.808709003 +0000 UTC m=+285.526566247" observedRunningTime="2026-04-16 18:34:46.621981175 +0000 UTC m=+286.339838438" watchObservedRunningTime="2026-04-16 18:34:46.622713703 +0000 UTC m=+286.340571033" Apr 16 18:34:46.639116 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:46.639074 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s87rb" podStartSLOduration=251.854645674 podStartE2EDuration="4m13.639063414s" podCreationTimestamp="2026-04-16 18:30:33 +0000 UTC" firstStartedPulling="2026-04-16 18:34:44.027653853 +0000 UTC m=+283.745511096" lastFinishedPulling="2026-04-16 18:34:45.812071585 +0000 UTC m=+285.529928836" observedRunningTime="2026-04-16 18:34:46.639033977 +0000 UTC m=+286.356891240" watchObservedRunningTime="2026-04-16 18:34:46.639063414 +0000 UTC m=+286.356920676" Apr 16 18:34:56.611099 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:34:56.611069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lp728" Apr 16 18:35:00.718508 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:00.718481 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:35:00.718979 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:00.718724 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:35:00.724380 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:00.724362 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:35:32.933625 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:32.933590 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:35:32.949014 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:32.948989 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:35:33.759526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:35:33.759499 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:40:00.742332 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:00.742302 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:40:00.742829 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:00.742813 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:40:38.243517 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.243480 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-2xdrd"] Apr 16 18:40:38.246773 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.246756 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.249671 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.249646 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:40:38.249763 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.249745 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:40:38.251050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.251030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6bjvb\"" Apr 16 18:40:38.251050 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.251048 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:40:38.259682 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.259662 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2xdrd"] Apr 16 18:40:38.320668 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.320632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/913bb92b-1184-4a71-85de-86f726afd42f-data\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.320838 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.320701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbnms\" (UniqueName: \"kubernetes.io/projected/913bb92b-1184-4a71-85de-86f726afd42f-kube-api-access-gbnms\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.422128 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.422096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbnms\" (UniqueName: \"kubernetes.io/projected/913bb92b-1184-4a71-85de-86f726afd42f-kube-api-access-gbnms\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.422291 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.422168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/913bb92b-1184-4a71-85de-86f726afd42f-data\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.422487 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.422472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/913bb92b-1184-4a71-85de-86f726afd42f-data\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.431934 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.431879 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbnms\" (UniqueName: \"kubernetes.io/projected/913bb92b-1184-4a71-85de-86f726afd42f-kube-api-access-gbnms\") pod \"seaweedfs-86cc847c5c-2xdrd\" (UID: \"913bb92b-1184-4a71-85de-86f726afd42f\") " pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.556794 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.556701 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:38.699138 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.699108 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2xdrd"] Apr 16 18:40:38.700357 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:40:38.700328 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913bb92b_1184_4a71_85de_86f726afd42f.slice/crio-c03fa18aedd8da38b214d0524e7ad492267099bb8cd9d1adb1fd7664879d2410 WatchSource:0}: Error finding container c03fa18aedd8da38b214d0524e7ad492267099bb8cd9d1adb1fd7664879d2410: Status 404 returned error can't find the container with id c03fa18aedd8da38b214d0524e7ad492267099bb8cd9d1adb1fd7664879d2410 Apr 16 18:40:38.701567 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:38.701550 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:40:39.610187 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:39.610146 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2xdrd" event={"ID":"913bb92b-1184-4a71-85de-86f726afd42f","Type":"ContainerStarted","Data":"c03fa18aedd8da38b214d0524e7ad492267099bb8cd9d1adb1fd7664879d2410"} Apr 16 18:40:41.616764 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:41.616733 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2xdrd" event={"ID":"913bb92b-1184-4a71-85de-86f726afd42f","Type":"ContainerStarted","Data":"2e738b6673a551e2f1db1d6b37d9113553c6e7f55245b8d780ba4980d059defa"} Apr 16 18:40:41.617209 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:41.616793 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:40:41.636288 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:41.636235 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-2xdrd" podStartSLOduration=1.14793737 podStartE2EDuration="3.636219681s" podCreationTimestamp="2026-04-16 18:40:38 +0000 UTC" firstStartedPulling="2026-04-16 18:40:38.701675365 +0000 UTC m=+638.419532607" lastFinishedPulling="2026-04-16 18:40:41.189957663 +0000 UTC m=+640.907814918" observedRunningTime="2026-04-16 18:40:41.635297224 +0000 UTC m=+641.353154487" watchObservedRunningTime="2026-04-16 18:40:41.636219681 +0000 UTC m=+641.354076942" Apr 16 18:40:47.622720 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:40:47.622685 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-2xdrd" Apr 16 18:41:47.319720 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.319644 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-jghww"] Apr 16 18:41:47.322047 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.322023 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.324885 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.324862 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:41:47.325005 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.324989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-7cndd\"" Apr 16 18:41:47.334963 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.334914 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jghww"] Apr 16 18:41:47.386408 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.386376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgqrk\" (UniqueName: \"kubernetes.io/projected/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-kube-api-access-lgqrk\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.386408 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.386413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.487419 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.487385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgqrk\" (UniqueName: \"kubernetes.io/projected/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-kube-api-access-lgqrk\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.487419 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.487421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.487623 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:41:47.487533 2570 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:41:47.487623 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:41:47.487594 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert podName:4ee0c56e-b9f3-4b8a-9751-08862dff1dcd nodeName:}" failed. No retries permitted until 2026-04-16 18:41:47.987578401 +0000 UTC m=+707.705435640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert") pod "odh-model-controller-696fc77849-jghww" (UID: "4ee0c56e-b9f3-4b8a-9751-08862dff1dcd") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:41:47.497358 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.497335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgqrk\" (UniqueName: \"kubernetes.io/projected/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-kube-api-access-lgqrk\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.990820 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.990780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:47.993123 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:47.993105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ee0c56e-b9f3-4b8a-9751-08862dff1dcd-cert\") pod \"odh-model-controller-696fc77849-jghww\" (UID: \"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd\") " pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:48.232694 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:48.232655 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:48.355607 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:48.355581 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jghww"] Apr 16 18:41:48.358199 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:41:48.358158 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee0c56e_b9f3_4b8a_9751_08862dff1dcd.slice/crio-d4f793952ac57e3e29b389d44c11d32d6790d48352f87ddd1e79c13e14c3a864 WatchSource:0}: Error finding container d4f793952ac57e3e29b389d44c11d32d6790d48352f87ddd1e79c13e14c3a864: Status 404 returned error can't find the container with id d4f793952ac57e3e29b389d44c11d32d6790d48352f87ddd1e79c13e14c3a864 Apr 16 18:41:48.808685 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:48.808653 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jghww" event={"ID":"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd","Type":"ContainerStarted","Data":"d4f793952ac57e3e29b389d44c11d32d6790d48352f87ddd1e79c13e14c3a864"} Apr 16 18:41:51.824198 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:51.824157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jghww" event={"ID":"4ee0c56e-b9f3-4b8a-9751-08862dff1dcd","Type":"ContainerStarted","Data":"2c4c0056ec300c098abb7bd8d3efb327d9d7f43fedfd19748aefa8d435f54d21"} Apr 16 18:41:51.824664 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:51.824273 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-jghww" Apr 16 18:41:51.841718 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:41:51.841665 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-jghww" 
Apr 16 18:42:02.829479 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:02.829449 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-jghww"
Apr 16 18:42:03.666512 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.666478 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-w22xc"]
Apr 16 18:42:03.668909 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.668892 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w22xc"
Apr 16 18:42:03.677301 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.677277 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-w22xc"]
Apr 16 18:42:03.726558 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.726528 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nm47\" (UniqueName: \"kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47\") pod \"s3-init-w22xc\" (UID: \"b25b7418-bcf6-451c-a994-75c6c69740b2\") " pod="kserve/s3-init-w22xc"
Apr 16 18:42:03.827820 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.827786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nm47\" (UniqueName: \"kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47\") pod \"s3-init-w22xc\" (UID: \"b25b7418-bcf6-451c-a994-75c6c69740b2\") " pod="kserve/s3-init-w22xc"
Apr 16 18:42:03.837613 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.837582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nm47\" (UniqueName: \"kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47\") pod \"s3-init-w22xc\" (UID: \"b25b7418-bcf6-451c-a994-75c6c69740b2\") " pod="kserve/s3-init-w22xc"
Apr 16 18:42:03.991000 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:03.990881 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w22xc"
Apr 16 18:42:04.112445 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:04.112419 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-w22xc"]
Apr 16 18:42:04.114666 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:42:04.114639 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb25b7418_bcf6_451c_a994_75c6c69740b2.slice/crio-b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e WatchSource:0}: Error finding container b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e: Status 404 returned error can't find the container with id b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e
Apr 16 18:42:04.869808 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:04.869764 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w22xc" event={"ID":"b25b7418-bcf6-451c-a994-75c6c69740b2","Type":"ContainerStarted","Data":"b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e"}
Apr 16 18:42:08.883512 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:08.883464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w22xc" event={"ID":"b25b7418-bcf6-451c-a994-75c6c69740b2","Type":"ContainerStarted","Data":"a2d45639731b5d911fec71d75abbe8495f12284498200403b5b84d06668811f2"}
Apr 16 18:42:08.903172 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:08.903110 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-w22xc" podStartSLOduration=1.545252016 podStartE2EDuration="5.90309119s" podCreationTimestamp="2026-04-16 18:42:03 +0000 UTC" firstStartedPulling="2026-04-16 18:42:04.116619591 +0000 UTC m=+723.834476845" lastFinishedPulling="2026-04-16 18:42:08.474458776 +0000 UTC m=+728.192316019" observedRunningTime="2026-04-16 18:42:08.900649024 +0000 UTC m=+728.618506286" watchObservedRunningTime="2026-04-16 18:42:08.90309119 +0000 UTC m=+728.620948452"
Apr 16 18:42:11.892428 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:11.892335 2570 generic.go:358] "Generic (PLEG): container finished" podID="b25b7418-bcf6-451c-a994-75c6c69740b2" containerID="a2d45639731b5d911fec71d75abbe8495f12284498200403b5b84d06668811f2" exitCode=0
Apr 16 18:42:11.892428 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:11.892405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w22xc" event={"ID":"b25b7418-bcf6-451c-a994-75c6c69740b2","Type":"ContainerDied","Data":"a2d45639731b5d911fec71d75abbe8495f12284498200403b5b84d06668811f2"}
Apr 16 18:42:13.016213 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.016191 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w22xc"
Need to start a new one" pod="kserve/s3-init-w22xc" Apr 16 18:42:13.109886 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.109850 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nm47\" (UniqueName: \"kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47\") pod \"b25b7418-bcf6-451c-a994-75c6c69740b2\" (UID: \"b25b7418-bcf6-451c-a994-75c6c69740b2\") " Apr 16 18:42:13.112089 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.112062 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47" (OuterVolumeSpecName: "kube-api-access-7nm47") pod "b25b7418-bcf6-451c-a994-75c6c69740b2" (UID: "b25b7418-bcf6-451c-a994-75c6c69740b2"). InnerVolumeSpecName "kube-api-access-7nm47". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:42:13.210982 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.210870 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nm47\" (UniqueName: \"kubernetes.io/projected/b25b7418-bcf6-451c-a994-75c6c69740b2-kube-api-access-7nm47\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:42:13.899472 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.899440 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-w22xc" event={"ID":"b25b7418-bcf6-451c-a994-75c6c69740b2","Type":"ContainerDied","Data":"b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e"} Apr 16 18:42:13.899472 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.899456 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-w22xc" Apr 16 18:42:13.899472 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:42:13.899472 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55e2efed2f9d2ee77f0fecd2e87550bc854ea463fcdd2d366e266258c8f830e" Apr 16 18:45:00.769392 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:45:00.769362 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:45:00.769876 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:45:00.769777 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:50:00.793087 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:50:00.793061 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:50:00.794194 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:50:00.794172 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:55:00.814498 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:00.814469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:55:00.816505 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:00.816484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:55:47.335005 ip-10-0-129-166 kubenswrapper[2570]: I0416 
18:55:47.334968 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9z9wg/must-gather-kgmpk"] Apr 16 18:55:47.335599 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.335444 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b25b7418-bcf6-451c-a994-75c6c69740b2" containerName="s3-init" Apr 16 18:55:47.335599 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.335463 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25b7418-bcf6-451c-a994-75c6c69740b2" containerName="s3-init" Apr 16 18:55:47.335599 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.335551 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b25b7418-bcf6-451c-a994-75c6c69740b2" containerName="s3-init" Apr 16 18:55:47.338652 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.338633 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:55:47.341309 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.341290 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9z9wg\"/\"kube-root-ca.crt\"" Apr 16 18:55:47.341427 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.341295 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9z9wg\"/\"openshift-service-ca.crt\"" Apr 16 18:55:47.348626 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.348080 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9z9wg\"/\"default-dockercfg-v2h9s\"" Apr 16 18:55:47.351173 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.351146 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9z9wg/must-gather-kgmpk"] Apr 16 18:55:47.475826 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.475791 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58195199-f074-44ab-9691-c5a07ca271ec-must-gather-output\") pod \"must-gather-kgmpk\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:55:47.475826 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.475828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zmg\" (UniqueName: \"kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg\") pod \"must-gather-kgmpk\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:55:47.577139 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.577097 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58195199-f074-44ab-9691-c5a07ca271ec-must-gather-output\") pod \"must-gather-kgmpk\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:55:47.577139 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.577140 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4zmg\" (UniqueName: \"kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg\") pod \"must-gather-kgmpk\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:55:47.577466 
Apr 16 18:55:47.586188 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.586129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4zmg\" (UniqueName: \"kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg\") pod \"must-gather-kgmpk\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " pod="openshift-must-gather-9z9wg/must-gather-kgmpk"
Apr 16 18:55:47.667179 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.667141 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9z9wg/must-gather-kgmpk"
Apr 16 18:55:47.785045 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.785016 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9z9wg/must-gather-kgmpk"]
Apr 16 18:55:47.787875 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:55:47.787845 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58195199_f074_44ab_9691_c5a07ca271ec.slice/crio-8336d0ac728c795ec8a890a69592d9d4436855f43e946b108a259ae879f50b91 WatchSource:0}: Error finding container 8336d0ac728c795ec8a890a69592d9d4436855f43e946b108a259ae879f50b91: Status 404 returned error can't find the container with id 8336d0ac728c795ec8a890a69592d9d4436855f43e946b108a259ae879f50b91
Apr 16 18:55:47.789451 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:47.789436 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:55:48.306225 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:48.306194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" event={"ID":"58195199-f074-44ab-9691-c5a07ca271ec","Type":"ContainerStarted","Data":"8336d0ac728c795ec8a890a69592d9d4436855f43e946b108a259ae879f50b91"}
Apr 16 18:55:52.320288 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:52.320253 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" event={"ID":"58195199-f074-44ab-9691-c5a07ca271ec","Type":"ContainerStarted","Data":"6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d"}
Apr 16 18:55:52.320288 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:52.320288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" event={"ID":"58195199-f074-44ab-9691-c5a07ca271ec","Type":"ContainerStarted","Data":"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"}
Apr 16 18:55:52.338724 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:55:52.338668 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" podStartSLOduration=1.337087427 podStartE2EDuration="5.33865285s" podCreationTimestamp="2026-04-16 18:55:47 +0000 UTC" firstStartedPulling="2026-04-16 18:55:47.789564856 +0000 UTC m=+1547.507422096" lastFinishedPulling="2026-04-16 18:55:51.791130275 +0000 UTC m=+1551.508987519" observedRunningTime="2026-04-16 18:55:52.337151213 +0000 UTC m=+1552.055008474" watchObservedRunningTime="2026-04-16 18:55:52.33865285 +0000 UTC m=+1552.056510111"
Apr 16 18:56:09.376641 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:09.376605 2570 generic.go:358] "Generic (PLEG): container finished" podID="58195199-f074-44ab-9691-c5a07ca271ec" containerID="a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee" exitCode=0
Apr 16 18:56:09.377070 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:09.376678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" event={"ID":"58195199-f074-44ab-9691-c5a07ca271ec","Type":"ContainerDied","Data":"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"}
Apr 16 18:56:09.377070 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:09.377025 2570 scope.go:117] "RemoveContainer" containerID="a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"
Apr 16 18:56:10.220558 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:10.220532 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9z9wg_must-gather-kgmpk_58195199-f074-44ab-9691-c5a07ca271ec/gather/0.log"
Apr 16 18:56:13.527707 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:13.527670 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cqphb_79a42a16-0e76-463a-87f0-ca53a4f24aa2/global-pull-secret-syncer/0.log"
Apr 16 18:56:13.674039 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:13.674012 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dqknm_42dd4370-c71a-4351-88f5-8f1f146f3846/konnectivity-agent/0.log"
Apr 16 18:56:13.726149 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:13.726119 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-166.ec2.internal_ca19b534e720fdd7bc90ad9dfbc6cf32/haproxy/0.log"
Apr 16 18:56:15.590577 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.590537 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9z9wg/must-gather-kgmpk"]
Apr 16 18:56:15.590995 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.590775 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="copy" containerID="cri-o://6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d" gracePeriod=2
Apr 16 18:56:15.596908 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.596880 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9z9wg/must-gather-kgmpk"]
Apr 16 18:56:15.806703 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.806679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9z9wg_must-gather-kgmpk_58195199-f074-44ab-9691-c5a07ca271ec/copy/0.log"
Apr 16 18:56:15.807037 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.807021 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9z9wg/must-gather-kgmpk"
Need to start a new one" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:56:15.809254 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.809233 2570 status_manager.go:895] "Failed to get status for pod" podUID="58195199-f074-44ab-9691-c5a07ca271ec" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" err="pods \"must-gather-kgmpk\" is forbidden: User \"system:node:ip-10-0-129-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-9z9wg\": no relationship found between node 'ip-10-0-129-166.ec2.internal' and this object" Apr 16 18:56:15.898786 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.898719 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4zmg\" (UniqueName: \"kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg\") pod \"58195199-f074-44ab-9691-c5a07ca271ec\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " Apr 16 18:56:15.898881 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.898822 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58195199-f074-44ab-9691-c5a07ca271ec-must-gather-output\") pod \"58195199-f074-44ab-9691-c5a07ca271ec\" (UID: \"58195199-f074-44ab-9691-c5a07ca271ec\") " Apr 16 18:56:15.900166 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.900133 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58195199-f074-44ab-9691-c5a07ca271ec-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "58195199-f074-44ab-9691-c5a07ca271ec" (UID: "58195199-f074-44ab-9691-c5a07ca271ec"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:15.900890 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.900866 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg" (OuterVolumeSpecName: "kube-api-access-s4zmg") pod "58195199-f074-44ab-9691-c5a07ca271ec" (UID: "58195199-f074-44ab-9691-c5a07ca271ec"). InnerVolumeSpecName "kube-api-access-s4zmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:56:15.999728 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.999696 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/58195199-f074-44ab-9691-c5a07ca271ec-must-gather-output\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:56:15.999728 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:15.999723 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4zmg\" (UniqueName: \"kubernetes.io/projected/58195199-f074-44ab-9691-c5a07ca271ec-kube-api-access-s4zmg\") on node \"ip-10-0-129-166.ec2.internal\" DevicePath \"\"" Apr 16 18:56:16.401579 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.401553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9z9wg_must-gather-kgmpk_58195199-f074-44ab-9691-c5a07ca271ec/copy/0.log" Apr 16 18:56:16.401871 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.401846 2570 generic.go:358] "Generic (PLEG): container finished" podID="58195199-f074-44ab-9691-c5a07ca271ec" containerID="6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d" exitCode=143 Apr 16 18:56:16.401938 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.401902 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" Apr 16 18:56:16.401990 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.401966 2570 scope.go:117] "RemoveContainer" containerID="6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d" Apr 16 18:56:16.404728 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.404703 2570 status_manager.go:895] "Failed to get status for pod" podUID="58195199-f074-44ab-9691-c5a07ca271ec" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" err="pods \"must-gather-kgmpk\" is forbidden: User \"system:node:ip-10-0-129-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-9z9wg\": no relationship found between node 'ip-10-0-129-166.ec2.internal' and this object" Apr 16 18:56:16.409658 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.409639 2570 scope.go:117] "RemoveContainer" containerID="a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee" Apr 16 18:56:16.412063 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.412039 2570 status_manager.go:895] "Failed to get status for pod" podUID="58195199-f074-44ab-9691-c5a07ca271ec" pod="openshift-must-gather-9z9wg/must-gather-kgmpk" err="pods \"must-gather-kgmpk\" is forbidden: User \"system:node:ip-10-0-129-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-9z9wg\": no relationship found between node 'ip-10-0-129-166.ec2.internal' and this object" Apr 16 18:56:16.420620 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.420606 2570 scope.go:117] "RemoveContainer" containerID="6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d" Apr 16 18:56:16.420889 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:56:16.420863 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d\": container with ID starting with 6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d not found: ID does not exist" containerID="6172bb87e93f771958050d97f8007bd430b2ac05a9b0dec64e900719840c4f2d" Apr 16 
Apr 16 18:56:16.421031 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.421017 2570 scope.go:117] "RemoveContainer" containerID="a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"
Apr 16 18:56:16.421271 ip-10-0-129-166 kubenswrapper[2570]: E0416 18:56:16.421242 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee\": container with ID starting with a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee not found: ID does not exist" containerID="a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"
Apr 16 18:56:16.421314 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.421272 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee"} err="failed to get container status \"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee\": rpc error: code = NotFound desc = could not find container \"a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee\": container with ID starting with a966c14355fd909df09ad7fc1ee578fc5c92ec054556e5f316be223b27426cee not found: ID does not exist"
Apr 16 18:56:16.776398 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:16.776302 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58195199-f074-44ab-9691-c5a07ca271ec" path="/var/lib/kubelet/pods/58195199-f074-44ab-9691-c5a07ca271ec/volumes"
Apr 16 18:56:17.169378 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.169351 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-24f2g_2b58bbe1-c23a-4746-87d2-0f22a039027a/cluster-monitoring-operator/0.log"
Apr 16 18:56:17.201140 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.201109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6lb8r_953954e4-b723-467d-afcb-ae1b158de42e/kube-state-metrics/0.log"
Apr 16 18:56:17.232575 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.232547 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6lb8r_953954e4-b723-467d-afcb-ae1b158de42e/kube-rbac-proxy-main/0.log"
Apr 16 18:56:17.266503 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.266472 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-6lb8r_953954e4-b723-467d-afcb-ae1b158de42e/kube-rbac-proxy-self/0.log"
Apr 16 18:56:17.330581 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.330503 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-llkbq_0640a4da-0593-4262-9936-a12646e54d28/monitoring-plugin/0.log"
Apr 16 18:56:17.368640 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.368612 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sb9j_eacdc74c-f6bb-4fce-a567-ab934f47b2c9/node-exporter/0.log"
Apr 16 18:56:17.394234 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.394200 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sb9j_eacdc74c-f6bb-4fce-a567-ab934f47b2c9/kube-rbac-proxy/0.log"
Apr 16 18:56:17.432984 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.432957 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sb9j_eacdc74c-f6bb-4fce-a567-ab934f47b2c9/init-textfile/0.log"
Apr 16 18:56:17.646092 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.646011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nf9vc_b8b92090-4fd9-4b07-973e-79e7cd43cfb5/kube-rbac-proxy-main/0.log"
Apr 16 18:56:17.669721 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.669695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nf9vc_b8b92090-4fd9-4b07-973e-79e7cd43cfb5/kube-rbac-proxy-self/0.log"
Apr 16 18:56:17.692206 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.692181 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-nf9vc_b8b92090-4fd9-4b07-973e-79e7cd43cfb5/openshift-state-metrics/0.log"
Apr 16 18:56:17.738608 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.738578 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/prometheus/0.log"
Apr 16 18:56:17.756294 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.756268 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/config-reloader/0.log"
Apr 16 18:56:17.777223 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.777200 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/thanos-sidecar/0.log"
Apr 16 18:56:17.799883 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.799859 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/kube-rbac-proxy-web/0.log"
Apr 16 18:56:17.822961 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.822939 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/kube-rbac-proxy/0.log"
Apr 16 18:56:17.846559 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.846537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/kube-rbac-proxy-thanos/0.log"
Apr 16 18:56:17.868871 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.868851 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ad24bfd7-e54b-42e4-8994-427c01a4fba9/init-config-reloader/0.log"
Apr 16 18:56:17.894530 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.894508 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-s48sj_34e75ef8-cf4c-4fcc-8522-9700e612bd1a/prometheus-operator/0.log"
Apr 16 18:56:17.916787 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.916762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-s48sj_34e75ef8-cf4c-4fcc-8522-9700e612bd1a/kube-rbac-proxy/0.log"
ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.916762 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-s48sj_34e75ef8-cf4c-4fcc-8522-9700e612bd1a/kube-rbac-proxy/0.log" Apr 16 18:56:17.941587 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.941563 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-9qvvd_80b150cc-effc-4cfb-a442-ad71bfb82365/prometheus-operator-admission-webhook/0.log" Apr 16 18:56:17.970094 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.970068 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794d85457c-r46ps_49fd2964-d995-433c-a395-164796728457/telemeter-client/0.log" Apr 16 18:56:17.999128 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:17.999105 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794d85457c-r46ps_49fd2964-d995-433c-a395-164796728457/reload/0.log" Apr 16 18:56:18.021715 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:18.021691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-794d85457c-r46ps_49fd2964-d995-433c-a395-164796728457/kube-rbac-proxy/0.log" Apr 16 18:56:20.324598 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324564 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q"] Apr 16 18:56:20.324968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324898 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="copy" Apr 16 18:56:20.324968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324908 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="copy" Apr 16 18:56:20.324968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324937 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="gather" Apr 16 18:56:20.324968 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324944 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="gather" Apr 16 18:56:20.325105 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.324992 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="gather" Apr 16 18:56:20.325105 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.325003 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="58195199-f074-44ab-9691-c5a07ca271ec" containerName="copy" Apr 16 18:56:20.327962 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.327843 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.330384 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.330362 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h2sf7\"/\"kube-root-ca.crt\"" Apr 16 18:56:20.331548 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.331530 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h2sf7\"/\"default-dockercfg-vvgvg\"" Apr 16 18:56:20.331653 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.331533 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h2sf7\"/\"openshift-service-ca.crt\"" Apr 16 18:56:20.335185 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.335161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q"] Apr 16 18:56:20.439991 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.439965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-proc\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.440172 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.439995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86x2x\" (UniqueName: \"kubernetes.io/projected/35fe0613-a18e-4dc3-aadb-12bb93758cec-kube-api-access-86x2x\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.440172 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.440051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-sys\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.440172 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.440068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-lib-modules\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.440172 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.440138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-podres\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-sys\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " 
pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541281 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541285 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-lib-modules\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-podres\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-proc\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-sys\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86x2x\" (UniqueName: \"kubernetes.io/projected/35fe0613-a18e-4dc3-aadb-12bb93758cec-kube-api-access-86x2x\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541450 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-proc\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541660 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-podres\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.541660 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.541517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35fe0613-a18e-4dc3-aadb-12bb93758cec-lib-modules\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.549663 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.549636 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-86x2x\" (UniqueName: \"kubernetes.io/projected/35fe0613-a18e-4dc3-aadb-12bb93758cec-kube-api-access-86x2x\") pod \"perf-node-gather-daemonset-c9g5q\" (UID: \"35fe0613-a18e-4dc3-aadb-12bb93758cec\") " pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.638757 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.638675 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:20.757150 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:20.757126 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q"] Apr 16 18:56:20.759213 ip-10-0-129-166 kubenswrapper[2570]: W0416 18:56:20.759189 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod35fe0613_a18e_4dc3_aadb_12bb93758cec.slice/crio-8a74f7ddd8ba48ebf5c929b47bb7a1fb61a432e0f39ed2d0d501236b8e8230ed WatchSource:0}: Error finding container 8a74f7ddd8ba48ebf5c929b47bb7a1fb61a432e0f39ed2d0d501236b8e8230ed: Status 404 returned error can't find the container with id 8a74f7ddd8ba48ebf5c929b47bb7a1fb61a432e0f39ed2d0d501236b8e8230ed Apr 16 18:56:21.067840 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.067767 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp728_8529cef7-b4bb-4d9b-9a9d-cd0b821f2437/dns/0.log" Apr 16 18:56:21.087143 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.087112 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lp728_8529cef7-b4bb-4d9b-9a9d-cd0b821f2437/kube-rbac-proxy/0.log" Apr 16 18:56:21.127999 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.127968 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cljk9_3cdc1460-1781-45b3-ad12-0173537882af/dns-node-resolver/0.log" Apr 16 18:56:21.419131 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.419099 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" event={"ID":"35fe0613-a18e-4dc3-aadb-12bb93758cec","Type":"ContainerStarted","Data":"c4ffdde03e5260a4335e82a5eeb8447fab049e69a4bbffa6aac7a3e929e6e33c"} Apr 16 18:56:21.419131 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.419132 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" event={"ID":"35fe0613-a18e-4dc3-aadb-12bb93758cec","Type":"ContainerStarted","Data":"8a74f7ddd8ba48ebf5c929b47bb7a1fb61a432e0f39ed2d0d501236b8e8230ed"} Apr 16 18:56:21.419526 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.419159 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:21.435533 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:21.435488 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" podStartSLOduration=1.43547413 podStartE2EDuration="1.43547413s" podCreationTimestamp="2026-04-16 18:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:21.433556019 +0000 UTC m=+1581.151413291" watchObservedRunningTime="2026-04-16 18:56:21.43547413 +0000 UTC m=+1581.153331391" Apr 16 18:56:21.545747 ip-10-0-129-166 
kubenswrapper[2570]: I0416 18:56:21.545720 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2z4jk_9521e1df-4c34-4a19-bce1-983c6712cca8/node-ca/0.log" Apr 16 18:56:22.617771 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:22.617736 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s87rb_a25871f6-0ad2-44ac-9f9c-492a30345e0e/serve-healthcheck-canary/0.log" Apr 16 18:56:23.105581 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:23.105551 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kwg6q_26032924-872b-462e-9763-af466c1929a8/kube-rbac-proxy/0.log" Apr 16 18:56:23.132804 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:23.132774 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kwg6q_26032924-872b-462e-9763-af466c1929a8/exporter/0.log" Apr 16 18:56:23.153983 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:23.153957 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kwg6q_26032924-872b-462e-9763-af466c1929a8/extractor/0.log" Apr 16 18:56:25.144488 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:25.144456 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-jghww_4ee0c56e-b9f3-4b8a-9751-08862dff1dcd/manager/0.log" Apr 16 18:56:25.164657 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:25.164637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-w22xc_b25b7418-bcf6-451c-a994-75c6c69740b2/s3-init/0.log" Apr 16 18:56:25.191058 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:25.191028 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-2xdrd_913bb92b-1184-4a71-85de-86f726afd42f/seaweedfs/0.log" Apr 16 18:56:27.431769 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:27.431742 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-h2sf7/perf-node-gather-daemonset-c9g5q" Apr 16 18:56:28.843256 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:28.843222 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6p5zv_5e59f99d-1bfa-4b7d-96e7-66ff560447ba/migrator/0.log" Apr 16 18:56:28.863347 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:28.863317 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-6p5zv_5e59f99d-1bfa-4b7d-96e7-66ff560447ba/graceful-termination/0.log" Apr 16 18:56:30.488507 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.488477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/kube-multus-additional-cni-plugins/0.log" Apr 16 18:56:30.512350 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.512319 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/egress-router-binary-copy/0.log" Apr 16 18:56:30.533975 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.533950 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/cni-plugins/0.log" Apr 16 18:56:30.554162 ip-10-0-129-166 kubenswrapper[2570]: I0416 
18:56:30.554141 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/bond-cni-plugin/0.log" Apr 16 18:56:30.576011 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.575988 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/routeoverride-cni/0.log" Apr 16 18:56:30.602797 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.602772 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/whereabouts-cni-bincopy/0.log" Apr 16 18:56:30.622331 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.622307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9zzzv_e4fd3994-9933-4354-aff4-2baed763eb94/whereabouts-cni/0.log" Apr 16 18:56:30.832816 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.832793 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sqh7m_ccad455d-b7e3-4ad9-b224-de9cedc28cb3/kube-multus/0.log" Apr 16 18:56:30.858270 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.858241 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kc2vf_1d7d2281-07bb-4906-844c-f53fbfe57143/network-metrics-daemon/0.log" Apr 16 18:56:30.881115 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:30.881092 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kc2vf_1d7d2281-07bb-4906-844c-f53fbfe57143/kube-rbac-proxy/0.log" Apr 16 18:56:31.803936 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.803902 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-controller/0.log" Apr 16 18:56:31.827091 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.827066 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/0.log" Apr 16 18:56:31.833432 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.833413 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovn-acl-logging/1.log" Apr 16 18:56:31.850698 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.850676 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/kube-rbac-proxy-node/0.log" Apr 16 18:56:31.873594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.873570 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:56:31.898467 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.898446 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/northd/0.log" Apr 16 18:56:31.923491 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.923458 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/nbdb/0.log" Apr 16 18:56:31.945132 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:31.945059 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/sbdb/0.log" Apr 16 18:56:32.032594 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:32.032561 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8t42k_9b204315-a289-4466-91fd-0714100a1752/ovnkube-controller/0.log" Apr 16 18:56:33.909464 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:33.909426 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-lfln6_7869b7a3-ff6a-4bf3-a99e-c2dd0d14231a/check-endpoints/0.log" Apr 16 18:56:34.013589 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:34.013559 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vdcwk_f3ade13f-7d4c-4574-bfc3-a946ccc0dd37/network-check-target-container/0.log" Apr 16 18:56:34.885442 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:34.885412 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mdxtz_d3f2c90c-d5bc-430b-8c49-774156034361/iptables-alerter/0.log" Apr 16 18:56:35.531108 ip-10-0-129-166 kubenswrapper[2570]: I0416 18:56:35.531076 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-dwhd9_93747e20-acab-493c-8520-f3b549e0c240/tuned/0.log"