Apr 24 16:36:41.408358 ip-10-0-143-104 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 16:36:41.408368 ip-10-0-143-104 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 16:36:41.408375 ip-10-0-143-104 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 16:36:41.408607 ip-10-0-143-104 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 16:36:51.628775 ip-10-0-143-104 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 16:36:51.628790 ip-10-0-143-104 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d5fd0a431b26443484c449192179a93a --
Apr 24 16:39:24.198846 ip-10-0-143-104 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:24.653435 ip-10-0-143-104 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.653435 ip-10-0-143-104 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:24.653435 ip-10-0-143-104 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.653435 ip-10-0-143-104 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:24.653435 ip-10-0-143-104 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:24.654337 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.654202 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:24.657418 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657403 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.657418 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657419 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657422 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657425 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657428 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657431 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657434 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657437 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657440 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657443 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657445 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657448 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657451 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657453 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657456 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657460 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657463 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657466 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657468 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657471 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.657541 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657476 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657480 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657483 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657486 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657490 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657493 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657496 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657513 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657517 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657520 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657523 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657526 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657528 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657531 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657535 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657538 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657542 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657545 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657548 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657551 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.658008 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657554 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657557 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657559 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657562 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657565 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657568 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657570 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657574 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657578 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657581 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657584 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657586 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657589 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657591 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657594 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657597 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657599 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657602 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657605 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657607 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.658488 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657610 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657613 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657623 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657626 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657629 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657631 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657634 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657637 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657640 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657643 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657648 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657650 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657653 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657656 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657658 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657661 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657663 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657666 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657669 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657671 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.659019 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657674 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657677 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657680 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657691 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657694 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.657696 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658112 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658117 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658121 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658123 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658126 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658129 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658132 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658135 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658137 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658140 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658142 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658145 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658148 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658151 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.659493 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658153 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658156 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658159 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658162 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658165 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658167 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658170 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658172 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658175 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658178 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658180 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658183 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658185 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658187 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658190 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658193 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658196 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658200 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658203 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658206 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.659993 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658209 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658212 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658214 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658217 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658220 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658222 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658225 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658228 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658230 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658233 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658236 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658240 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658242 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658245 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658247 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658250 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658253 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658256 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658258 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.660490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658260 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658263 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658266 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658268 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658270 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658273 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658275 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658278 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658281 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658283 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658286 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658288 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658291 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658294 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658298 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658302 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658305 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658308 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658310 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.660971 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658313 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658316 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658318 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658321 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658324 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658327 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658330 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658333 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658335 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658338 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658341 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658344 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658346 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.658349 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658423 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658430 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658437 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658442 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658447 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658450 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658455 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:24.661469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658459 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658463 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658466 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658470 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658473 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658476 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658479 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658482 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658485 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658488 2581 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658491 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658494 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658514 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658518 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658521 2581 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658524 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658528 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658532 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658535 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658538 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658541 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658544 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658549 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658552 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658556 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:39:24.661994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658558 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658563 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658566 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658569 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658572 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658575 2581 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658578 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658586 2581 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658589 2581 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658592 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658595 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658598 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658602 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658605 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658608 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658612 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658615 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658618 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658621 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:39:24.662687 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:39:24.658624 2581 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658627 2581 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658630 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658632 2581 flags.go:64] FLAG: --feature-gates="" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658636 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658639 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:39:24.662687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658642 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658646 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658649 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658652 2581 flags.go:64] FLAG: --help="false" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658654 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-143-104.ec2.internal" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658658 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658661 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658664 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:24.658668 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658671 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658675 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658677 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658680 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658683 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658686 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658689 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658692 2581 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658695 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658698 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658701 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658704 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658709 2581 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658712 
2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658715 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:24.663290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658718 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658724 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658727 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658729 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658732 2581 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658735 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658739 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658742 2581 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658745 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658749 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658752 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658756 2581 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658760 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:24.663883 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658763 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658766 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658769 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658775 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658778 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658782 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658790 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658793 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658796 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658799 2581 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:24.663883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658802 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658808 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658811 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658814 2581 flags.go:64] 
FLAG: --pods-per-core="0" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658817 2581 flags.go:64] FLAG: --port="10250" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658820 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658824 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-063dc0a4887f86a08" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658827 2581 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658830 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658833 2581 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658836 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658839 2581 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658843 2581 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658846 2581 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658848 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658851 2581 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658855 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658859 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658862 2581 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658865 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658868 2581 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658870 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658873 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658876 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658880 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658883 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:24.664439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658888 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658891 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658894 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658897 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658900 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658903 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658905 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658908 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658911 2581 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658914 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658919 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658922 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658926 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658930 2581 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658933 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658936 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658939 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658942 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658945 2581 flags.go:64] FLAG: --v="2" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658949 2581 flags.go:64] FLAG: --version="false" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658953 2581 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658958 2581 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.658961 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659050 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:24.665066 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659054 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659057 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659060 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659063 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659066 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659068 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659071 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659074 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659079 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659081 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 
16:39:24.659084 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659086 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659089 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659091 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659094 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659096 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659099 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659101 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659104 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659107 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:24.665665 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659110 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659112 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659115 2581 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659118 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659120 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659123 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659126 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659128 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659131 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659133 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659136 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659138 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659141 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659143 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659146 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659150 2581 feature_gate.go:351] Setting GA feature 
gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659153 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659156 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659159 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659162 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:24.666190 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659167 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659170 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659173 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659177 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659179 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659182 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659186 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659188 2581 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659191 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659194 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659197 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659200 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659203 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659205 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659208 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659211 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659213 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659216 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659219 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659221 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:24.666692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659224 
2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659226 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659229 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659232 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659234 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659237 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659239 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659241 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659244 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659246 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659249 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659251 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659255 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 
16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659258 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659260 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659263 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659265 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659268 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659270 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:24.667176 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659274 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659278 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659281 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659283 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659286 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.659289 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.659961 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.667176 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.667192 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667241 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667247 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace 
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667250 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667254 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667257 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667260 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667263 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.667653 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667266 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667269 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667271 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667275 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667280 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667283 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667286 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667289 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667292 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667295 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667298 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667300 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667303 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667306 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667308 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667311 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667313 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667316 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667318 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.668086 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667322 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667324 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667327 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667329 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667332 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667336 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667339 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667342 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667344 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667347 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667349 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667352 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667354 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667357 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667359 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667362 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667365 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667367 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667370 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667372 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.668568 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667374 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667377 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667379 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667382 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667385 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667387 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667390 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667392 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667395 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667397 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667400 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667402 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667405 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667407 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667409 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667412 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667415 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667419 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667425 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667428 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.669058 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667430 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667433 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667435 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667438 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667440 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667443 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667445 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667448 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667451 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667453 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667456 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667459 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667461 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667464 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667467 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667470 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667474 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667477 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667480 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.669554 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667483 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.667488 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667608 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667613 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667617 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667620 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667623 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667625 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667628 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667630 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667634 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667637 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667640 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667642 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667645 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:24.670031 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667648 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667651 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667654 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667657 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667660 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667662 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667665 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667668 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667670 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667673 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667675 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667678 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667680 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667683 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667685 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667688 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667691 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667693 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667696 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667698 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:24.670426 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667701 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667703 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667707 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667711 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667714 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667716 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667719 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667722 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667725 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667728 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667731 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667733 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667736 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667739 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667742 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667744 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667747 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667749 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667752 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:24.670927 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667754 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667757 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667760 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667762 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667765 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667767 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667770 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667772 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667775 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667777 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667780 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667783 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667786 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667789 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667791 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667793 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667796 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667798 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667801 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667803 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:24.671398 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667807 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667809 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667812 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667814 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667817 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667819 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667822 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667825 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667829 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667831 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667834 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667836 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667839 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:24.667842 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.667847 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 16:39:24.671900 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.668622 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 16:39:24.672275 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.670637 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 16:39:24.672275 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.671533 2581 server.go:1019] "Starting client certificate rotation"
Apr 24 16:39:24.672275 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.671624 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:24.672469 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.672457 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:24.696653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.696632 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:24.702004 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.701978 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:24.715195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.715175 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 24 16:39:24.721592 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.721573 2581 log.go:25] "Validated CRI v1 image API"
Apr 24 16:39:24.723410 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.723392 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 16:39:24.728295 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.728270 2581 fs.go:135] Filesystem UUIDs: map[2445be63-19d5-4d20-91dd-081837740820:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9c5c4eeb-23b9-4b64-aee8-d46c5a8ac7a5:/dev/nvme0n1p4]
Apr 24 16:39:24.728352 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.728295 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:39:24.734188 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.734168 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:24.734698 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.734569 2581 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:24.732665937 +0000 UTC m=+0.418390173 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3158664 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b2eadb553eb3092ddf083da43e2a7 SystemUUID:ec2b2ead-b553-eb30-92dd-f083da43e2a7 BootID:d5fd0a43-1b26-4434-84c4-49192179a93a Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a1:b7:b9:e6:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a1:b7:b9:e6:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:c3:2a:87:be:b8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:39:24.734698 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.734693 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:39:24.734828 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.734816 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:24.735953 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.735928 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:24.736088 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.735955 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-104.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:39:24.736136 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.736096 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:39:24.736136 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.736106 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:39:24.736136 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.736119 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:24.736803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.736790 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:24.737643 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.737633 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:24.737927 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.737918 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:39:24.740353 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.740343 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:39:24.740392 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.740357 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:39:24.740392 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.740372 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:39:24.740392 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.740381 2581 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:39:24.740392 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.740390 2581 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 16:39:24.741584 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.741571 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:39:24.741622 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.741598 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:39:24.744756 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.744741 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:39:24.748789 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.748730 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:39:24.750336 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750314 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:39:24.750336 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750336 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750344 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750353 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750361 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750369 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750378 2581 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750386 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750397 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750407 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750430 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:39:24.750462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.750443 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:39:24.751292 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.751280 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:39:24.751339 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.751296 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:39:24.753630 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.753605 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:39:24.753630 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.753618 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-104.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:39:24.754949 ip-10-0-143-104 kubenswrapper[2581]: 
I0424 16:39:24.754934 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:39:24.755033 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.754976 2581 server.go:1295] "Started kubelet" Apr 24 16:39:24.755124 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.755045 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 16:39:24.755168 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.755118 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:39:24.755209 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.755176 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:39:24.755899 ip-10-0-143-104 systemd[1]: Started Kubernetes Kubelet. Apr 24 16:39:24.756826 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.756670 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:39:24.758139 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.758122 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:39:24.761891 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.761872 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:39:24.762406 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.762392 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:39:24.763143 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.763127 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:39:24.763143 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.763129 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:39:24.763275 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.763156 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:39:24.763337 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:24.763326 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:39:24.763383 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.763339 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:39:24.763859 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.763330 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:24.765355 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765340 2581 factory.go:55] Registering systemd factory Apr 24 16:39:24.765443 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765363 2581 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:39:24.765653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765628 2581 factory.go:153] Registering CRI-O factory Apr 24 16:39:24.765653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765648 2581 factory.go:223] Registration of the crio container factory successfully Apr 24 16:39:24.765781 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765767 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:39:24.765831 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765800 2581 factory.go:103] Registering Raw factory Apr 24 16:39:24.765831 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.765817 2581 manager.go:1196] Started watching for new ooms in manager Apr 24 16:39:24.766435 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.766223 2581 manager.go:319] Starting recovery of all containers Apr 24 16:39:24.767438 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.767393 2581 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:39:24.769254 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.769232 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:39:24.769424 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.769407 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-104.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:39:24.769551 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.769464 2581 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-104.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:39:24.770572 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.769608 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-104.ec2.internal.18a9586fb3c3eb64 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-104.ec2.internal,UID:ip-10-0-143-104.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-104.ec2.internal,},FirstTimestamp:2026-04-24 16:39:24.754946916 +0000 UTC 
m=+0.440671153,LastTimestamp:2026-04-24 16:39:24.754946916 +0000 UTC m=+0.440671153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-104.ec2.internal,}" Apr 24 16:39:24.778347 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.778320 2581 manager.go:324] Recovery completed Apr 24 16:39:24.783953 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.783939 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.786687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.786566 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.786687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.786622 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.786687 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.786640 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:24.787171 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.787157 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:39:24.787171 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.787170 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:39:24.787254 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.787188 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:24.788888 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.788824 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-104.ec2.internal.18a9586fb5a6abf6 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-104.ec2.internal,UID:ip-10-0-143-104.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-104.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-104.ec2.internal,},FirstTimestamp:2026-04-24 16:39:24.786584566 +0000 UTC m=+0.472308801,LastTimestamp:2026-04-24 16:39:24.786584566 +0000 UTC m=+0.472308801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-104.ec2.internal,}" Apr 24 16:39:24.789519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.789489 2581 policy_none.go:49] "None policy: Start" Apr 24 16:39:24.789577 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.789523 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:39:24.789577 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.789534 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 24 16:39:24.797559 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.797476 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-104.ec2.internal.18a9586fb5a75dd7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-104.ec2.internal,UID:ip-10-0-143-104.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-104.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-104.ec2.internal,},FirstTimestamp:2026-04-24 16:39:24.786630103 +0000 UTC m=+0.472354340,LastTimestamp:2026-04-24 16:39:24.786630103 +0000 
UTC m=+0.472354340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-104.ec2.internal,}" Apr 24 16:39:24.801682 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.801655 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-55dqs" Apr 24 16:39:24.808445 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.808367 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-104.ec2.internal.18a9586fb5a7a039 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-104.ec2.internal,UID:ip-10-0-143-104.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-143-104.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-143-104.ec2.internal,},FirstTimestamp:2026-04-24 16:39:24.786647097 +0000 UTC m=+0.472371338,LastTimestamp:2026-04-24 16:39:24.786647097 +0000 UTC m=+0.472371338,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-104.ec2.internal,}" Apr 24 16:39:24.822141 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822120 2581 manager.go:341] "Starting Device Plugin manager" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.822215 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822231 2581 server.go:85] "Starting device plugin registration server" Apr 24 16:39:24.831519 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822372 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-55dqs" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822548 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822559 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822643 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822723 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.822732 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.823334 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 16:39:24.831519 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.823373 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:24.879570 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.879527 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:39:24.880733 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.880720 2581 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 16:39:24.880797 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.880747 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:39:24.880797 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.880768 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:39:24.880797 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.880775 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:39:24.880912 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.880809 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:39:24.883281 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.883258 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:24.922896 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.922810 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.923826 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.923806 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.923826 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.923838 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.923977 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.923849 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:24.923977 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.923876 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-104.ec2.internal" Apr 24 16:39:24.935354 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:24.935337 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-104.ec2.internal" Apr 24 16:39:24.935414 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.935361 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-104.ec2.internal\": node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:24.963357 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:24.963335 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:24.981806 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.981777 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal"] Apr 24 16:39:24.981877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.981860 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.982811 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.982797 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.982861 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.982827 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.982861 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.982837 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:24.984063 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984051 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.984246 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984233 2581 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:24.984279 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984260 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.984758 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984738 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.984877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984769 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.984877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984778 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:24.984877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984743 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.984877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984844 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.984877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.984861 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:24.985833 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.985818 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:24.985906 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.985842 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:24.986515 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.986490 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:24.986580 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.986530 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:24.986580 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:24.986543 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:25.011177 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.011156 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-104.ec2.internal\" not found" node="ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.015527 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.015496 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-104.ec2.internal\" not found" node="ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.063973 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.063946 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.066159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.066143 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.066212 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.066169 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.066212 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.066191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/acbc91bbcdbc790f59d9cba82c01d807-config\") pod \"kube-apiserver-proxy-ip-10-0-143-104.ec2.internal\" (UID: \"acbc91bbcdbc790f59d9cba82c01d807\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.164053 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.164023 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.167241 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.167292 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167250 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.167292 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/acbc91bbcdbc790f59d9cba82c01d807-config\") pod \"kube-apiserver-proxy-ip-10-0-143-104.ec2.internal\" (UID: \"acbc91bbcdbc790f59d9cba82c01d807\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.167357 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167332 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.167357 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167339 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aee8a8bce52f235626e7af91e66c220e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal\" (UID: \"aee8a8bce52f235626e7af91e66c220e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.167416 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.167331 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/acbc91bbcdbc790f59d9cba82c01d807-config\") pod \"kube-apiserver-proxy-ip-10-0-143-104.ec2.internal\" (UID: \"acbc91bbcdbc790f59d9cba82c01d807\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.264353 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.264312 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.313663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.313632 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.318283 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.318260 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.365179 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.365145 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.465617 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.465595 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.566235 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.566165 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-104.ec2.internal\" not found" Apr 24 16:39:25.623062 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.623031 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:25.663009 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.662974 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.670952 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.670934 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to 
start using new credentials" Apr 24 16:39:25.671077 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.671058 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:39:25.671120 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.671098 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:39:25.671149 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.671113 2581 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ad5a83301898545a2a4b6f9fcd26aa85-e35855ac38ca2517.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.143.104:43764->3.210.230.117:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.671149 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.671135 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" Apr 24 16:39:25.692243 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.692219 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:25.740585 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.740547 2581 apiserver.go:52] "Watching apiserver" Apr 24 16:39:25.748445 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.748421 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" 
Apr 24 16:39:25.749455 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.749365 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-rx464","openshift-multus/multus-additional-cni-plugins-7xj6k","openshift-multus/network-metrics-daemon-q5b2h","openshift-ovn-kubernetes/ovnkube-node-k59gs","kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal","openshift-dns/node-resolver-z9v4l","openshift-image-registry/node-ca-vcjb9","openshift-multus/multus-8vprt","openshift-network-diagnostics/network-check-target-9wjxs","openshift-network-operator/iptables-alerter-9ldxt","kube-system/konnectivity-agent-m7ftj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc"]
Apr 24 16:39:25.752017 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.751992 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-z9v4l"
Apr 24 16:39:25.754142 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.754119 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m7ftj"
Apr 24 16:39:25.754330 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.754312 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.754408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.754391 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.754546 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.754532 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vbkkw\""
Apr 24 16:39:25.755313 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.755293 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.756007 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.755990 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 16:39:25.756076 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.756013 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mtr2z\""
Apr 24 16:39:25.756207 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.756195 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 16:39:25.756558 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.756546 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7xj6k"
Apr 24 16:39:25.756783 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.756683 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:25.756783 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.756747 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:25.757084 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.757066 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.757163 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.757148 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7t26t\""
Apr 24 16:39:25.757291 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.757272 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.757912 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.757892 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.758819 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.758804 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 16:39:25.759091 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.759080 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vcjb9"
Apr 24 16:39:25.760280 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760266 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.760564 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760547 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 16:39:25.760778 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760763 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 16:39:25.760857 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760791 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.760913 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760895 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.760986 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760972 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 16:39:25.761042 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760985 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 16:39:25.761042 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.760992 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.761313 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761297 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.761396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761341 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 16:39:25.761456 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761402 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.761529 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761489 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-q4njf\""
Apr 24 16:39:25.761529 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761495 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 16:39:25.761529 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761513 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.761688 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761538 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 16:39:25.761793 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761777 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4h7z6\""
Apr 24 16:39:25.761882 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761868 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dzdl9\""
Apr 24 16:39:25.761975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.761962 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:25.764065 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.763185 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:25.764065 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.763271 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9ldxt"
Apr 24 16:39:25.764065 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.763278 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:25.764903 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.764881 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 16:39:25.765115 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765099 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tr4zg\""
Apr 24 16:39:25.765250 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765225 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc"
Apr 24 16:39:25.766018 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765714 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.766018 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765762 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.766018 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765844 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 16:39:25.766018 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.765887 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hpbqv\""
Apr 24 16:39:25.769707 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.769682 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 16:39:25.769805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.769748 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 16:39:25.769883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.769859 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xqkrc\""
Apr 24 16:39:25.770099 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770077 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-node-log\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.770183 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770111 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-log-socket\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.770183 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770133 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.770183 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770157 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-hostroot\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.770183 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770198 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-host\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770223 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770239 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-cnibin\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770289 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-env-overrides\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770320 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-socket-dir-parent\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770348 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-socket-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770372 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-run\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.770396 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770393 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-tmp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.770813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770417 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k"
Apr 24 16:39:25.770813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770442 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k"
Apr 24 16:39:25.770813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770473 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-system-cni-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.770813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.770525 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-os-release\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771105 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9xh\" (UniqueName: \"kubernetes.io/projected/8a2acb5b-dd6f-415a-a081-ae20b03878ff-kube-api-access-xt9xh\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771254 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a13ed17d-5b83-44be-8c88-f98632b2ac89-tmp-dir\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771288 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771304 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-config\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771342 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-device-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771381 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzsp\" (UniqueName: \"kubernetes.io/projected/f9857732-ae10-4c66-8e34-589690779e84-kube-api-access-cfzsp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771419 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-slash\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771449 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-cnibin\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771543 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-systemd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771715 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-var-lib-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.771761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771762 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-conf-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771901 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx56p\" (UniqueName: \"kubernetes.io/projected/a13ed17d-5b83-44be-8c88-f98632b2ac89-kube-api-access-mx56p\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.771980 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-conf\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772029 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772062 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpcfw\" (UniqueName: \"kubernetes.io/projected/0048dae9-a5eb-4707-9a78-5385f148fdf1-kube-api-access-jpcfw\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772118 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a2acb5b-dd6f-415a-a081-ae20b03878ff-iptables-alerter-script\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772171 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772212 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-systemd-units\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772259 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-netns\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.772316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772301 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-bin\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772335 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-agent-certs\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772356 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-etc-kubernetes\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772572 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8vts\" (UniqueName: \"kubernetes.io/projected/93e0ec36-5590-491f-b620-59d8d420540c-kube-api-access-m8vts\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772643 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-systemd\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464"
Apr 24 16:39:25.772730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772673 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-os-release\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772733 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-kubelet\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772758 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772789 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-cni-binary-copy\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772818 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-k8s-cni-cncf-io\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.772975 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:39:25.772846 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-registration-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772877 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysconfig\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772900 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-lib-modules\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.772975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.772975 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-kubelet\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773009 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-kubernetes\") pod \"tuned-rx464\" (UID: 
\"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773041 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-var-lib-kubelet\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773102 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-system-cni-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773162 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhwj\" (UniqueName: \"kubernetes.io/projected/d85b39e7-4145-4783-a50d-e94999b43e90-kube-api-access-nhhwj\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773194 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-ovn\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773245 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-netd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773278 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-cni-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773305 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-daemon-config\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.773341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773339 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-modprobe-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773404 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-sys\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773443 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3323ce5-5f82-4b32-8290-e8a47d64634b-host\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773474 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3323ce5-5f82-4b32-8290-e8a47d64634b-serviceca\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773536 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ctfc\" (UniqueName: \"kubernetes.io/projected/f3323ce5-5f82-4b32-8290-e8a47d64634b-kube-api-access-9ctfc\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773568 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-netns\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773614 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-multus\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:25.773686 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-multus-certs\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773722 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773757 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-script-lib\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.773803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773788 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-konnectivity-ca\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773829 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-bin\") pod \"multus-8vprt\" (UID: 
\"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773860 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vchvr\" (UniqueName: \"kubernetes.io/projected/d9f551fe-0d59-471b-b35c-3abef14bb13f-kube-api-access-vchvr\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773888 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a2acb5b-dd6f-415a-a081-ae20b03878ff-host-slash\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773948 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-sys-fs\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.773971 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-etc-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.774000 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovn-node-metrics-cert\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.774030 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.774060 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a13ed17d-5b83-44be-8c88-f98632b2ac89-hosts-file\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.774090 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-etc-tuned\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.774247 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.774156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4c4z\" (UniqueName: \"kubernetes.io/projected/a611eeef-0446-421a-b3c5-d38e773087f7-kube-api-access-v4c4z\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.776657 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:25.776636 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:39:25.804467 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.804444 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-g6zrx" Apr 24 16:39:25.818593 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.818534 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g6zrx" Apr 24 16:39:25.824380 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.824348 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:24 +0000 UTC" deadline="2028-01-12 01:26:34.43091543 +0000 UTC" Apr 24 16:39:25.824380 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.824376 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15056h47m8.606542123s" Apr 24 16:39:25.852891 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:25.852846 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacbc91bbcdbc790f59d9cba82c01d807.slice/crio-1378785c4a026c431e4e32a38388fc789aaae8ddeb70da70dfc587d719ffce5d WatchSource:0}: Error finding container 1378785c4a026c431e4e32a38388fc789aaae8ddeb70da70dfc587d719ffce5d: Status 404 returned error can't find the container with id 1378785c4a026c431e4e32a38388fc789aaae8ddeb70da70dfc587d719ffce5d Apr 24 16:39:25.853293 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:25.853268 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee8a8bce52f235626e7af91e66c220e.slice/crio-ab93d763c38948df81e63f2f3cc7a9032ebde8980f02665e253cb2d9870e476c WatchSource:0}: Error finding container ab93d763c38948df81e63f2f3cc7a9032ebde8980f02665e253cb2d9870e476c: Status 404 returned error can't find the container with id ab93d763c38948df81e63f2f3cc7a9032ebde8980f02665e253cb2d9870e476c Apr 24 16:39:25.857491 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.857475 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:39:25.864048 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.864027 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 16:39:25.875065 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875044 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-var-lib-kubelet\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875072 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-system-cni-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875090 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhwj\" (UniqueName: \"kubernetes.io/projected/d85b39e7-4145-4783-a50d-e94999b43e90-kube-api-access-nhhwj\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " 
pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875112 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-ovn\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875136 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-netd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875159 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-cni-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875158 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-system-cni-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875158 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-var-lib-kubelet\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875181 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-ovn\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875203 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-netd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-cni-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-daemon-config\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875292 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-modprobe-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-sys\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875338 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3323ce5-5f82-4b32-8290-e8a47d64634b-host\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875426 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-modprobe-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875436 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-sys\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875436 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f3323ce5-5f82-4b32-8290-e8a47d64634b-host\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.875604 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:39:25.875459 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3323ce5-5f82-4b32-8290-e8a47d64634b-serviceca\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875525 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ctfc\" (UniqueName: \"kubernetes.io/projected/f3323ce5-5f82-4b32-8290-e8a47d64634b-kube-api-access-9ctfc\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875543 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-netns\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875560 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-multus\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875576 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-multus-certs\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.875604 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:39:25.875598 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875612 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-netns\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875621 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-multus\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875624 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-script-lib\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875661 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-multus-certs\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:39:25.875666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-konnectivity-ca\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875701 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-bin\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875726 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vchvr\" (UniqueName: \"kubernetes.io/projected/d9f551fe-0d59-471b-b35c-3abef14bb13f-kube-api-access-vchvr\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875749 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a2acb5b-dd6f-415a-a081-ae20b03878ff-host-slash\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875783 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-sys-fs\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.876315 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875809 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-etc-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovn-node-metrics-cert\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875864 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875867 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-daemon-config\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875889 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a13ed17d-5b83-44be-8c88-f98632b2ac89-hosts-file\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 
16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875913 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-etc-tuned\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875937 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4c4z\" (UniqueName: \"kubernetes.io/projected/a611eeef-0446-421a-b3c5-d38e773087f7-kube-api-access-v4c4z\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.876315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875947 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-sys-fs\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875963 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-node-log\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875987 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-log-socket\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875988 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-cni-bin\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875991 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876038 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-hostroot\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876044 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-etc-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876061 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876083 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-host\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876106 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-cnibin\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876157 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-env-overrides\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876181 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-socket-dir-parent\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-socket-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876208 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a2acb5b-dd6f-415a-a081-ae20b03878ff-host-slash\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-script-lib\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.877159 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876230 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-run\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.875964 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f3323ce5-5f82-4b32-8290-e8a47d64634b-serviceca\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876254 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876283 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-tmp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876308 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876321 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876334 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876338 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-host\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-konnectivity-ca\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-system-cni-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876382 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-cnibin\") pod 
\"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876383 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-os-release\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876403 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-node-log\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876414 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt9xh\" (UniqueName: \"kubernetes.io/projected/8a2acb5b-dd6f-415a-a081-ae20b03878ff-kube-api-access-xt9xh\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876428 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-socket-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876434 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/a13ed17d-5b83-44be-8c88-f98632b2ac89-hosts-file\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876458 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876480 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-ovn-kubernetes\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878011 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876484 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a13ed17d-5b83-44be-8c88-f98632b2ac89-tmp-dir\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876486 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-socket-dir-parent\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876536 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-hostroot\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876553 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-config\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876582 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-run\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876623 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-log-socket\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876636 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-os-release\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876670 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-system-cni-dir\") pod 
\"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876811 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a13ed17d-5b83-44be-8c88-f98632b2ac89-tmp-dir\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876937 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-device-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzsp\" (UniqueName: \"kubernetes.io/projected/f9857732-ae10-4c66-8e34-589690779e84-kube-api-access-cfzsp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.876995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-slash\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877018 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-cnibin\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877044 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877038 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-device-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-systemd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877104 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-var-lib-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.878724 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877111 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-cnibin\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-conf-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877150 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-slash\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877161 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx56p\" (UniqueName: \"kubernetes.io/projected/a13ed17d-5b83-44be-8c88-f98632b2ac89-kube-api-access-mx56p\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877171 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-systemd\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877131 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovnkube-config\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877203 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-multus-conf-dir\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-conf\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-var-lib-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877272 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877304 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpcfw\" (UniqueName: \"kubernetes.io/projected/0048dae9-a5eb-4707-9a78-5385f148fdf1-kube-api-access-jpcfw\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.877320 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877329 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a2acb5b-dd6f-415a-a081-ae20b03878ff-iptables-alerter-script\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877329 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0048dae9-a5eb-4707-9a78-5385f148fdf1-env-overrides\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877361 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877397 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-conf\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.877408 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:26.377373602 +0000 UTC m=+2.063097830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.879438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877454 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-systemd-units\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-netns\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877531 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-bin\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877583 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-run-openvswitch\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877590 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-agent-certs\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877668 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-systemd-units\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877678 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-etc-kubernetes\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " 
pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877708 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8vts\" (UniqueName: \"kubernetes.io/projected/93e0ec36-5590-491f-b620-59d8d420540c-kube-api-access-m8vts\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877715 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-run-netns\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877735 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877757 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-etc-kubernetes\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-systemd\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877824 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-os-release\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877851 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-kubelet\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877875 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877893 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.877917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-cni-binary-copy\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.879957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878003 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-cni-bin\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878034 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-k8s-cni-cncf-io\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878056 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a611eeef-0446-421a-b3c5-d38e773087f7-os-release\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878078 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-etc-selinux\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878114 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-registration-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878135 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-var-lib-kubelet\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysconfig\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878194 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysctl-d\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878238 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9f551fe-0d59-471b-b35c-3abef14bb13f-host-run-k8s-cni-cncf-io\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878261 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-lib-modules\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878290 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-kubelet\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878295 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-sysconfig\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-kubernetes\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878348 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/93e0ec36-5590-491f-b620-59d8d420540c-registration-dir\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-systemd\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878409 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0048dae9-a5eb-4707-9a78-5385f148fdf1-host-kubelet\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878485 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-etc-kubernetes\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.880608 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878557 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a2acb5b-dd6f-415a-a081-ae20b03878ff-iptables-alerter-script\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878591 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9857732-ae10-4c66-8e34-589690779e84-lib-modules\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878695 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9f551fe-0d59-471b-b35c-3abef14bb13f-cni-binary-copy\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.878764 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a611eeef-0446-421a-b3c5-d38e773087f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.879218 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-etc-tuned\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.879526 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0048dae9-a5eb-4707-9a78-5385f148fdf1-ovn-node-metrics-cert\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.880342 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f9857732-ae10-4c66-8e34-589690779e84-tmp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.881057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.880427 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/0e0b1833-0ea6-4684-8c49-7ad78d75cec2-agent-certs\") pod \"konnectivity-agent-m7ftj\" (UID: \"0e0b1833-0ea6-4684-8c49-7ad78d75cec2\") " pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:25.884144 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.884097 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" event={"ID":"aee8a8bce52f235626e7af91e66c220e","Type":"ContainerStarted","Data":"ab93d763c38948df81e63f2f3cc7a9032ebde8980f02665e253cb2d9870e476c"} Apr 24 16:39:25.885040 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.885020 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" event={"ID":"acbc91bbcdbc790f59d9cba82c01d807","Type":"ContainerStarted","Data":"1378785c4a026c431e4e32a38388fc789aaae8ddeb70da70dfc587d719ffce5d"} Apr 24 16:39:25.893357 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.893170 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:25.893357 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.893203 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:25.893357 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.893218 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:25.893357 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:25.893293 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:26.393272084 +0000 UTC m=+2.078996331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:25.895817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.895792 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt9xh\" (UniqueName: \"kubernetes.io/projected/8a2acb5b-dd6f-415a-a081-ae20b03878ff-kube-api-access-xt9xh\") pod \"iptables-alerter-9ldxt\" (UID: \"8a2acb5b-dd6f-415a-a081-ae20b03878ff\") " pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:25.895817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.895808 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhwj\" (UniqueName: \"kubernetes.io/projected/d85b39e7-4145-4783-a50d-e94999b43e90-kube-api-access-nhhwj\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:25.895817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.895816 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4c4z\" (UniqueName: \"kubernetes.io/projected/a611eeef-0446-421a-b3c5-d38e773087f7-kube-api-access-v4c4z\") pod \"multus-additional-cni-plugins-7xj6k\" (UID: \"a611eeef-0446-421a-b3c5-d38e773087f7\") " pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:25.896702 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:39:25.896580 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ctfc\" (UniqueName: \"kubernetes.io/projected/f3323ce5-5f82-4b32-8290-e8a47d64634b-kube-api-access-9ctfc\") pod \"node-ca-vcjb9\" (UID: \"f3323ce5-5f82-4b32-8290-e8a47d64634b\") " pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:25.896702 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.896601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx56p\" (UniqueName: \"kubernetes.io/projected/a13ed17d-5b83-44be-8c88-f98632b2ac89-kube-api-access-mx56p\") pod \"node-resolver-z9v4l\" (UID: \"a13ed17d-5b83-44be-8c88-f98632b2ac89\") " pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:25.896852 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.896814 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vchvr\" (UniqueName: \"kubernetes.io/projected/d9f551fe-0d59-471b-b35c-3abef14bb13f-kube-api-access-vchvr\") pod \"multus-8vprt\" (UID: \"d9f551fe-0d59-471b-b35c-3abef14bb13f\") " pod="openshift-multus/multus-8vprt" Apr 24 16:39:25.897051 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.897031 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzsp\" (UniqueName: \"kubernetes.io/projected/f9857732-ae10-4c66-8e34-589690779e84-kube-api-access-cfzsp\") pod \"tuned-rx464\" (UID: \"f9857732-ae10-4c66-8e34-589690779e84\") " pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:25.898635 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.898619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8vts\" (UniqueName: \"kubernetes.io/projected/93e0ec36-5590-491f-b620-59d8d420540c-kube-api-access-m8vts\") pod \"aws-ebs-csi-driver-node-cmstc\" (UID: \"93e0ec36-5590-491f-b620-59d8d420540c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 
24 16:39:25.898635 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:25.898630 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpcfw\" (UniqueName: \"kubernetes.io/projected/0048dae9-a5eb-4707-9a78-5385f148fdf1-kube-api-access-jpcfw\") pod \"ovnkube-node-k59gs\" (UID: \"0048dae9-a5eb-4707-9a78-5385f148fdf1\") " pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:26.089976 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.089896 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-z9v4l" Apr 24 16:39:26.096393 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.096370 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda13ed17d_5b83_44be_8c88_f98632b2ac89.slice/crio-2a7c8c0d04b508bbb5385dfbede4e2253c45f77f23266bebe7586200df0ee713 WatchSource:0}: Error finding container 2a7c8c0d04b508bbb5385dfbede4e2253c45f77f23266bebe7586200df0ee713: Status 404 returned error can't find the container with id 2a7c8c0d04b508bbb5385dfbede4e2253c45f77f23266bebe7586200df0ee713 Apr 24 16:39:26.099232 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.099213 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:26.105792 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.105770 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0b1833_0ea6_4684_8c49_7ad78d75cec2.slice/crio-533c5d5be2a9fceee09f485bf980f15917ed1aef0ac207f0c6ccb2bc5034de1d WatchSource:0}: Error finding container 533c5d5be2a9fceee09f485bf980f15917ed1aef0ac207f0c6ccb2bc5034de1d: Status 404 returned error can't find the container with id 533c5d5be2a9fceee09f485bf980f15917ed1aef0ac207f0c6ccb2bc5034de1d Apr 24 16:39:26.111578 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.111557 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rx464" Apr 24 16:39:26.118171 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.118149 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9857732_ae10_4c66_8e34_589690779e84.slice/crio-0b25bd72b9c7b77e90459f96bab83238d8483bd99ca4f85244b774fc8c2c8ec0 WatchSource:0}: Error finding container 0b25bd72b9c7b77e90459f96bab83238d8483bd99ca4f85244b774fc8c2c8ec0: Status 404 returned error can't find the container with id 0b25bd72b9c7b77e90459f96bab83238d8483bd99ca4f85244b774fc8c2c8ec0 Apr 24 16:39:26.127821 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.127802 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" Apr 24 16:39:26.133706 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.133676 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda611eeef_0446_421a_b3c5_d38e773087f7.slice/crio-8072700e210609dc00f3c87d6c176eb8af7e5ea3b315537f2eba51c9a508231c WatchSource:0}: Error finding container 8072700e210609dc00f3c87d6c176eb8af7e5ea3b315537f2eba51c9a508231c: Status 404 returned error can't find the container with id 8072700e210609dc00f3c87d6c176eb8af7e5ea3b315537f2eba51c9a508231c Apr 24 16:39:26.140881 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.140861 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:26.146452 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.146428 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0048dae9_a5eb_4707_9a78_5385f148fdf1.slice/crio-ab02722f029d48a9909883a11d207bd58369deb4f537e53969b69f96f8af79eb WatchSource:0}: Error finding container ab02722f029d48a9909883a11d207bd58369deb4f537e53969b69f96f8af79eb: Status 404 returned error can't find the container with id ab02722f029d48a9909883a11d207bd58369deb4f537e53969b69f96f8af79eb Apr 24 16:39:26.150411 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.150392 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vcjb9" Apr 24 16:39:26.155237 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.155216 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8vprt" Apr 24 16:39:26.161286 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.161131 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f551fe_0d59_471b_b35c_3abef14bb13f.slice/crio-6a0df0f3ce1046f49ae31356eb4d8d940e63cf3bd182e4c836dee64969f5a7c7 WatchSource:0}: Error finding container 6a0df0f3ce1046f49ae31356eb4d8d940e63cf3bd182e4c836dee64969f5a7c7: Status 404 returned error can't find the container with id 6a0df0f3ce1046f49ae31356eb4d8d940e63cf3bd182e4c836dee64969f5a7c7 Apr 24 16:39:26.161828 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.161809 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9ldxt" Apr 24 16:39:26.167134 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.167116 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" Apr 24 16:39:26.168644 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.168620 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2acb5b_dd6f_415a_a081_ae20b03878ff.slice/crio-8ac0e545d035cecdf29ee54e912a506dc8c749b786cd8b5225ea2b61a4866d31 WatchSource:0}: Error finding container 8ac0e545d035cecdf29ee54e912a506dc8c749b786cd8b5225ea2b61a4866d31: Status 404 returned error can't find the container with id 8ac0e545d035cecdf29ee54e912a506dc8c749b786cd8b5225ea2b61a4866d31 Apr 24 16:39:26.174099 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:39:26.174074 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e0ec36_5590_491f_b620_59d8d420540c.slice/crio-f38b0eff4b51c05e9d6d46b73c6c0af067f6f473c9a5bdc8c7d181e62d0aab08 WatchSource:0}: Error finding container 
f38b0eff4b51c05e9d6d46b73c6c0af067f6f473c9a5bdc8c7d181e62d0aab08: Status 404 returned error can't find the container with id f38b0eff4b51c05e9d6d46b73c6c0af067f6f473c9a5bdc8c7d181e62d0aab08 Apr 24 16:39:26.282084 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.282056 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:26.323491 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.323461 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:26.381366 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.381237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:26.381530 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.381399 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:26.381530 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.381468 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.381447181 +0000 UTC m=+3.067171431 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:26.482276 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.482012 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:26.482276 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.482173 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:26.482276 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.482195 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:26.482276 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.482207 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:26.482276 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:26.482266 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.482248385 +0000 UTC m=+3.167972609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:26.561493 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.561366 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:26.819629 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.819546 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:25 +0000 UTC" deadline="2027-10-09 13:16:31.318746434 +0000 UTC"
Apr 24 16:39:26.819629 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.819582 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12788h37m4.4991693s"
Apr 24 16:39:26.896951 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.896902 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" event={"ID":"93e0ec36-5590-491f-b620-59d8d420540c","Type":"ContainerStarted","Data":"f38b0eff4b51c05e9d6d46b73c6c0af067f6f473c9a5bdc8c7d181e62d0aab08"}
Apr 24 16:39:26.902360 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.902277 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8vprt" event={"ID":"d9f551fe-0d59-471b-b35c-3abef14bb13f","Type":"ContainerStarted","Data":"6a0df0f3ce1046f49ae31356eb4d8d940e63cf3bd182e4c836dee64969f5a7c7"}
Apr 24 16:39:26.914231 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.913969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rx464" event={"ID":"f9857732-ae10-4c66-8e34-589690779e84","Type":"ContainerStarted","Data":"0b25bd72b9c7b77e90459f96bab83238d8483bd99ca4f85244b774fc8c2c8ec0"}
Apr 24 16:39:26.931226 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.931167 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z9v4l" event={"ID":"a13ed17d-5b83-44be-8c88-f98632b2ac89","Type":"ContainerStarted","Data":"2a7c8c0d04b508bbb5385dfbede4e2253c45f77f23266bebe7586200df0ee713"}
Apr 24 16:39:26.942617 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.942557 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9ldxt" event={"ID":"8a2acb5b-dd6f-415a-a081-ae20b03878ff","Type":"ContainerStarted","Data":"8ac0e545d035cecdf29ee54e912a506dc8c749b786cd8b5225ea2b61a4866d31"}
Apr 24 16:39:26.949947 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.949874 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vcjb9" event={"ID":"f3323ce5-5f82-4b32-8290-e8a47d64634b","Type":"ContainerStarted","Data":"a072f1b69345b4748673a68e14de40f74f240374ff0127d38f9a57a03930da71"}
Apr 24 16:39:26.968019 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.967947 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"ab02722f029d48a9909883a11d207bd58369deb4f537e53969b69f96f8af79eb"}
Apr 24 16:39:26.992987 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:26.992951 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerStarted","Data":"8072700e210609dc00f3c87d6c176eb8af7e5ea3b315537f2eba51c9a508231c"}
Apr 24 16:39:27.010797 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.009560 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m7ftj" event={"ID":"0e0b1833-0ea6-4684-8c49-7ad78d75cec2","Type":"ContainerStarted","Data":"533c5d5be2a9fceee09f485bf980f15917ed1aef0ac207f0c6ccb2bc5034de1d"}
Apr 24 16:39:27.389968 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.389928 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:27.393842 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.393814 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:27.393991 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.393909 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:29.393887777 +0000 UTC m=+5.079612003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:27.491005 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.490963 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:27.491186 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.491139 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:27.491186 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.491159 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:27.491186 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.491171 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:27.491337 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.491231 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:29.491212338 +0000 UTC m=+5.176936581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:27.820461 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.820412 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:25 +0000 UTC" deadline="2028-01-17 23:58:23.483984194 +0000 UTC"
Apr 24 16:39:27.820461 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.820454 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15199h18m55.663533423s"
Apr 24 16:39:27.881321 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.881279 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:27.881485 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.881426 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:27.881888 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:27.881860 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:27.881972 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:27.881954 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:28.199732 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:28.199653 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:29.405557 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:29.405489 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:29.406017 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.405754 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:29.406017 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.405826 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:33.405806276 +0000 UTC m=+9.091530497 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:29.507158 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:29.506536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:29.507158 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.506731 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:29.507158 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.506749 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:29.507158 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.506763 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:29.507158 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.506820 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:33.506801243 +0000 UTC m=+9.192525470 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:29.881947 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:29.881911 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:29.882129 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:29.881911 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:29.882129 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.882054 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:29.882129 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:29.882074 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:31.881477 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:31.881433 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:31.882039 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:31.881433 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:31.882039 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:31.881602 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:31.882039 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:31.881664 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:33.440125 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:33.439984 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:33.440651 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.440149 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:33.440651 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.440227 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.440205662 +0000 UTC m=+17.125929898 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:33.540584 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:33.540549 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:33.540783 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.540764 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:33.540851 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.540789 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:33.540851 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.540803 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:33.540942 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.540863 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.540843592 +0000 UTC m=+17.226567831 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:33.881468 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:33.881394 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:33.881662 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.881543 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:33.881662 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:33.881394 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:33.881662 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:33.881651 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:35.881575 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:35.881543 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:35.882044 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:35.881546 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:35.882044 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:35.881654 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:35.882044 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:35.881737 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:37.881487 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:37.881449 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:37.882022 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:37.881449 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:37.882022 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:37.881589 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:37.882022 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:37.881722 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:39.881347 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:39.881312 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:39.881347 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:39.881343 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:39.881860 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:39.881442 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:39.881860 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:39.881564 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:41.497746 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:41.497709 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:41.498166 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.497832 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:41.498166 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.497906 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:57.497890688 +0000 UTC m=+33.183614910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:41.598260 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:41.598225 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:41.598484 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.598405 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:41.598484 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.598426 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:41.598484 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.598436 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:41.598624 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.598494 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:57.598473546 +0000 UTC m=+33.284197769 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:41.881081 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:41.880993 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:41.881242 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:41.880993 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:41.881242 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.881135 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:41.881242 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:41.881226 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:43.881524 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:43.881473 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:43.881874 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:43.881482 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:43.881874 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:43.881614 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:43.881874 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:43.881667 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90"
Apr 24 16:39:45.044878 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.044647 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8vprt" event={"ID":"d9f551fe-0d59-471b-b35c-3abef14bb13f","Type":"ContainerStarted","Data":"14f903f7234ed871ef3452a3ddd0fd2380aee82bde1a56ee6560fac22ba35943"}
Apr 24 16:39:45.046474 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.046354 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rx464" event={"ID":"f9857732-ae10-4c66-8e34-589690779e84","Type":"ContainerStarted","Data":"b015474876441be056a0764f23c0ae94de4103d35900ef420e7a75f0cc8c856e"}
Apr 24 16:39:45.047826 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.047777 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" event={"ID":"acbc91bbcdbc790f59d9cba82c01d807","Type":"ContainerStarted","Data":"2df0c6bced5b01c712dcc961e427a37643a28155630e9dde76a2fa173f5d8682"}
Apr 24 16:39:45.050428 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050412 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log"
Apr 24 16:39:45.050759 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050737 2581 generic.go:358] "Generic (PLEG): container finished" podID="0048dae9-a5eb-4707-9a78-5385f148fdf1" containerID="eecc8cb819c9e53fd1dfab661f8dd20128f1486beffe8b46c1a66b9c0f486a7d" exitCode=1
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050770 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"89ef3c8940b1fc10e04fb77b44678a75577cb420875ba8a35ea7bd51a9751d4a"}
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050784 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"72e578fd7e65f10fb744e2695c98f1dfcf9ae9ba938eac159d27b6bef307a9ba"}
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050797 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"b85d7edde123b0a21f3b818a5327cd6d788043b41dfe7fe1b89790c865a401a9"}
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050809 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"7553c679f536417ecf89f647444fb6f9c7f79a1af35527b0119fb8f7a94a9e86"}
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050822 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerDied","Data":"eecc8cb819c9e53fd1dfab661f8dd20128f1486beffe8b46c1a66b9c0f486a7d"}
Apr 24 16:39:45.050853 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.050836 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"8fdc5211e91c8a31b4a46ca38e70750bdf024060de9cba949d911d18b74e24a8"}
Apr 24 16:39:45.066255 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.066201 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8vprt" podStartSLOduration=1.7937247699999999 podStartE2EDuration="20.066183732s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.162650607 +0000 UTC m=+1.848374828" lastFinishedPulling="2026-04-24 16:39:44.435109556 +0000 UTC m=+20.120833790" observedRunningTime="2026-04-24 16:39:45.06575184 +0000 UTC m=+20.751476082" watchObservedRunningTime="2026-04-24 16:39:45.066183732 +0000 UTC m=+20.751907977"
Apr 24 16:39:45.079654 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.079611 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-104.ec2.internal" podStartSLOduration=20.079599561 podStartE2EDuration="20.079599561s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:45.0792013 +0000 UTC m=+20.764925542" watchObservedRunningTime="2026-04-24 16:39:45.079599561 +0000 UTC m=+20.765323804"
Apr 24 16:39:45.097885 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.097829 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rx464" podStartSLOduration=2.099450308 podStartE2EDuration="20.097811123s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.119658495 +0000 UTC m=+1.805382716" lastFinishedPulling="2026-04-24 16:39:44.118019295 +0000 UTC m=+19.803743531" observedRunningTime="2026-04-24 16:39:45.096923481 +0000 UTC m=+20.782647730" watchObservedRunningTime="2026-04-24 16:39:45.097811123 +0000 UTC m=+20.783535366"
Apr 24 16:39:45.881772 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.881743 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:39:45.881904 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:45.881851 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485"
Apr 24 16:39:45.881939 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:45.881912 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h"
Apr 24 16:39:45.882047 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:45.882029 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:46.014441 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.014418 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:46.053782 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.053753 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9ldxt" event={"ID":"8a2acb5b-dd6f-415a-a081-ae20b03878ff","Type":"ContainerStarted","Data":"708d59de8a70cb809f5f2614cc16fb6bd0f8ed347661000e88612c207da9ce54"} Apr 24 16:39:46.055008 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.054984 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vcjb9" event={"ID":"f3323ce5-5f82-4b32-8290-e8a47d64634b","Type":"ContainerStarted","Data":"a1394e21cec1cddca44e9c88c8811a51c6ae684648da25edfa0e63958ea8bfc1"} Apr 24 16:39:46.056216 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.056196 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="6c79b284422f227648a3633db5bca6c2cf3139bc58c8c2f0f253315a21196bd9" exitCode=0 Apr 24 16:39:46.056287 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.056255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"6c79b284422f227648a3633db5bca6c2cf3139bc58c8c2f0f253315a21196bd9"} Apr 24 16:39:46.057622 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.057587 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m7ftj" event={"ID":"0e0b1833-0ea6-4684-8c49-7ad78d75cec2","Type":"ContainerStarted","Data":"1338af8e36a6cceb6d3ffa015f3ffd50b9173d7abc72977a90c06933f5873b36"} Apr 24 16:39:46.059010 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.058991 2581 generic.go:358] "Generic (PLEG): container finished" podID="aee8a8bce52f235626e7af91e66c220e" containerID="28f41ef2eda565e65c7310003997eebb6f40156867f072d27a91ebf7cd152a7f" exitCode=0 Apr 24 16:39:46.059078 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.059055 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" event={"ID":"aee8a8bce52f235626e7af91e66c220e","Type":"ContainerDied","Data":"28f41ef2eda565e65c7310003997eebb6f40156867f072d27a91ebf7cd152a7f"} Apr 24 16:39:46.059208 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.059191 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" Apr 24 16:39:46.060572 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.060554 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" event={"ID":"93e0ec36-5590-491f-b620-59d8d420540c","Type":"ContainerStarted","Data":"aa6eb5c3c8b6cb839fd533b6538e9f494584bb37a692728d7cb81f2939d75608"} Apr 24 16:39:46.060657 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.060575 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" event={"ID":"93e0ec36-5590-491f-b620-59d8d420540c","Type":"ContainerStarted","Data":"c6cd250b96035a32396138f0f1e8e550fd64d9b9c148e02655329e4307d5f566"} Apr 24 16:39:46.061706 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.061685 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z9v4l" event={"ID":"a13ed17d-5b83-44be-8c88-f98632b2ac89","Type":"ContainerStarted","Data":"fe1648ddb545a43d14b4a72cbf7d860261464944eb738098a35b198cec57ade0"} Apr 24 16:39:46.073145 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.073124 2581 warnings.go:110] 
"Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:39:46.073813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.073795 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal"] Apr 24 16:39:46.074406 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.074372 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9ldxt" podStartSLOduration=3.128438908 podStartE2EDuration="21.074361456s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.170661934 +0000 UTC m=+1.856386155" lastFinishedPulling="2026-04-24 16:39:44.116584468 +0000 UTC m=+19.802308703" observedRunningTime="2026-04-24 16:39:46.073872112 +0000 UTC m=+21.759596352" watchObservedRunningTime="2026-04-24 16:39:46.074361456 +0000 UTC m=+21.760085699" Apr 24 16:39:46.100756 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.100716 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m7ftj" podStartSLOduration=4.091204234 podStartE2EDuration="22.10070382s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.107190306 +0000 UTC m=+1.792914527" lastFinishedPulling="2026-04-24 16:39:44.116689883 +0000 UTC m=+19.802414113" observedRunningTime="2026-04-24 16:39:46.100040154 +0000 UTC m=+21.785764411" watchObservedRunningTime="2026-04-24 16:39:46.10070382 +0000 UTC m=+21.786428065" Apr 24 16:39:46.150326 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.150246 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-z9v4l" podStartSLOduration=4.131724785 podStartE2EDuration="22.15023164s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" 
firstStartedPulling="2026-04-24 16:39:26.09789784 +0000 UTC m=+1.783622060" lastFinishedPulling="2026-04-24 16:39:44.11640468 +0000 UTC m=+19.802128915" observedRunningTime="2026-04-24 16:39:46.149916135 +0000 UTC m=+21.835640378" watchObservedRunningTime="2026-04-24 16:39:46.15023164 +0000 UTC m=+21.835955883" Apr 24 16:39:46.150474 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.150357 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vcjb9" podStartSLOduration=3.191759188 podStartE2EDuration="21.150350075s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.158128998 +0000 UTC m=+1.843853234" lastFinishedPulling="2026-04-24 16:39:44.116719883 +0000 UTC m=+19.802444121" observedRunningTime="2026-04-24 16:39:46.134564576 +0000 UTC m=+21.820288818" watchObservedRunningTime="2026-04-24 16:39:46.150350075 +0000 UTC m=+21.836074321" Apr 24 16:39:46.831697 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.831352 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:46.01443521Z","UUID":"8a3e1e1e-4ee5-4615-bf17-bb29d48251c6","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:46.834276 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.834255 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:46.834384 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:46.834285 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:47.067168 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.067099 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:39:47.067634 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.067588 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"04b32682d8d071e86ea15a97691871bbd9795f0b174c5f46a8d22be215e0e4fb"} Apr 24 16:39:47.069306 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.069282 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" event={"ID":"aee8a8bce52f235626e7af91e66c220e","Type":"ContainerStarted","Data":"d6504274d6a3b93c41bdb7edcbb8d6c72326c140c843210f3492a3605e95b3bf"} Apr 24 16:39:47.071407 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.071383 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" event={"ID":"93e0ec36-5590-491f-b620-59d8d420540c","Type":"ContainerStarted","Data":"451c98a1edaf09a56d4392ccc852b7df8b91bfe1cf8d9a8f58a5d48e1b540d00"} Apr 24 16:39:47.085270 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.085227 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-104.ec2.internal" podStartSLOduration=1.085214502 podStartE2EDuration="1.085214502s" podCreationTimestamp="2026-04-24 16:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:47.085083452 +0000 UTC m=+22.770807698" watchObservedRunningTime="2026-04-24 16:39:47.085214502 +0000 UTC m=+22.770938745" Apr 24 16:39:47.109280 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.109227 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cmstc" podStartSLOduration=1.55771674 podStartE2EDuration="22.109211166s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.175476035 +0000 UTC m=+1.861200256" lastFinishedPulling="2026-04-24 16:39:46.726970458 +0000 UTC m=+22.412694682" observedRunningTime="2026-04-24 16:39:47.109107976 +0000 UTC m=+22.794832219" watchObservedRunningTime="2026-04-24 16:39:47.109211166 +0000 UTC m=+22.794935411" Apr 24 16:39:47.413534 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.413448 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:47.414059 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.414013 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:47.881557 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.881517 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:47.881557 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:47.881535 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:47.881832 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:47.881627 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:47.881832 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:47.881776 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:49.075380 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:49.075350 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:49.880979 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:49.880954 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:49.881114 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:49.880961 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:49.881183 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:49.881151 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:49.881183 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:49.881038 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:50.080781 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.080627 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:39:50.081229 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.081169 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"0b0c2ec799ad0c3bf516d8a77a616aa496983e8c3752287f040afa6c1a6d826a"} Apr 24 16:39:50.081484 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.081466 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:50.081581 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.081492 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:50.081740 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.081724 2581 scope.go:117] "RemoveContainer" containerID="eecc8cb819c9e53fd1dfab661f8dd20128f1486beffe8b46c1a66b9c0f486a7d" Apr 24 16:39:50.099846 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:50.099820 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:51.086345 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.086317 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:39:51.086856 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.086690 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" event={"ID":"0048dae9-a5eb-4707-9a78-5385f148fdf1","Type":"ContainerStarted","Data":"7a3233440860a5c540d657698af15a85fe03cf00b50d19efd665e09dfe23dae2"} Apr 24 16:39:51.086919 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.086893 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:51.088419 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.088384 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="89bf5f967eb746df180d8c6cb6a35b0390c78e674fd3c7bbeaade19f57a79837" exitCode=0 Apr 24 16:39:51.088553 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.088418 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"89bf5f967eb746df180d8c6cb6a35b0390c78e674fd3c7bbeaade19f57a79837"} Apr 24 16:39:51.101330 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.101309 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" Apr 24 16:39:51.122953 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.121121 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs" podStartSLOduration=8.110931166 podStartE2EDuration="26.121072772s" podCreationTimestamp="2026-04-24 16:39:25 +0000 
UTC" firstStartedPulling="2026-04-24 16:39:26.148056947 +0000 UTC m=+1.833781181" lastFinishedPulling="2026-04-24 16:39:44.158198563 +0000 UTC m=+19.843922787" observedRunningTime="2026-04-24 16:39:51.120108474 +0000 UTC m=+26.805832719" watchObservedRunningTime="2026-04-24 16:39:51.121072772 +0000 UTC m=+26.806797015" Apr 24 16:39:51.639024 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.638990 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:51.639277 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.639138 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:51.639725 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.639667 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m7ftj" Apr 24 16:39:51.881889 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.881852 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:51.881889 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:51.881907 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:51.882119 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:51.882019 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:51.882181 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:51.882156 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:52.091874 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:52.091840 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="0db80fc3be5fbf064df3d8f7cf5267e74abb720d28eed565e627d4f201f63dad" exitCode=0 Apr 24 16:39:52.092271 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:52.091935 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"0db80fc3be5fbf064df3d8f7cf5267e74abb720d28eed565e627d4f201f63dad"} Apr 24 16:39:53.095217 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:53.095182 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="b3259e9dfbd9d11732f7b867be57843809787da3d9c61fe437e935dd9bc60eaf" exitCode=0 Apr 24 16:39:53.095701 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:53.095260 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"b3259e9dfbd9d11732f7b867be57843809787da3d9c61fe437e935dd9bc60eaf"} Apr 24 16:39:53.881319 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:53.881288 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:53.881319 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:53.881305 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:53.881577 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:53.881398 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:53.881577 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:53.881563 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:55.881996 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:55.881961 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:55.882462 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:55.882078 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:55.882462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:55.882147 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:55.882462 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:55.882275 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:57.152688 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.152255 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q5b2h"] Apr 24 16:39:57.152688 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.152644 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:57.155091 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.152765 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:57.155091 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.154761 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9wjxs"] Apr 24 16:39:57.155091 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.154861 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:57.155091 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.154947 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:39:57.523163 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.523096 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:57.523328 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.523267 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:57.523398 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.523350 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs podName:d85b39e7-4145-4783-a50d-e94999b43e90 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:40:29.523328033 +0000 UTC m=+65.209052277 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs") pod "network-metrics-daemon-q5b2h" (UID: "d85b39e7-4145-4783-a50d-e94999b43e90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:57.624039 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:57.623991 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:57.624200 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.624177 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:57.624200 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.624197 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:57.624308 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.624210 2581 projected.go:194] Error preparing data for projected volume kube-api-access-l7gbg for pod openshift-network-diagnostics/network-check-target-9wjxs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:57.624308 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:57.624268 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg 
podName:4fda4ceb-5ea7-4202-903b-a9a5b5152485 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:29.624249251 +0000 UTC m=+65.309973497 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7gbg" (UniqueName: "kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg") pod "network-check-target-9wjxs" (UID: "4fda4ceb-5ea7-4202-903b-a9a5b5152485") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:58.884120 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:58.884095 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:39:58.884766 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:39:58.884093 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:39:58.884766 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:58.884242 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:39:58.884766 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:39:58.884351 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:40:00.111694 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:00.111659 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="bd63ede3a1d6258d68ea88f346b95d689afbde3e12c2abe95bebd0026af2acad" exitCode=0 Apr 24 16:40:00.112174 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:00.111735 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"bd63ede3a1d6258d68ea88f346b95d689afbde3e12c2abe95bebd0026af2acad"} Apr 24 16:40:00.881389 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:00.881357 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:40:00.881389 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:00.881388 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:00.881627 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:00.881522 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q5b2h" podUID="d85b39e7-4145-4783-a50d-e94999b43e90" Apr 24 16:40:00.881627 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:00.881569 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9wjxs" podUID="4fda4ceb-5ea7-4202-903b-a9a5b5152485" Apr 24 16:40:01.116671 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.116634 2581 generic.go:358] "Generic (PLEG): container finished" podID="a611eeef-0446-421a-b3c5-d38e773087f7" containerID="71a934580fd99180f459b85d3035314c3bdd52d960a86441034bffbec720ff9d" exitCode=0 Apr 24 16:40:01.117153 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.116685 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerDied","Data":"71a934580fd99180f459b85d3035314c3bdd52d960a86441034bffbec720ff9d"} Apr 24 16:40:01.191873 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.191686 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-104.ec2.internal" event="NodeReady" Apr 24 16:40:01.192015 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.192001 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:40:01.244550 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.244488 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7smgx"] Apr 24 16:40:01.253741 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.253721 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p2t5q"] Apr 24 16:40:01.253894 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.253879 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.256259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.256088 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:40:01.256259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.256109 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:40:01.256259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.256188 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:40:01.256259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.256201 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r6qpw\"" Apr 24 16:40:01.256595 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.256417 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:40:01.263523 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.263474 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7smgx"] Apr 24 16:40:01.263618 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.263532 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p2t5q"] Apr 24 16:40:01.263686 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.263628 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.265937 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.265918 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x6pvx\"" Apr 24 16:40:01.266189 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.266175 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:40:01.266385 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.266372 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:40:01.340462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.340425 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6nvzh"] Apr 24 16:40:01.351989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.351954 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-tmp-dir\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.351989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.351998 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.352224 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352034 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-data-volume\") pod 
\"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.352224 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352078 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-metrics-tls\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.352224 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-config-volume\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.352323 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352233 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.352323 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352255 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtjb9\" (UniqueName: \"kubernetes.io/projected/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-kube-api-access-wtjb9\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.352323 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352280 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5d5\" (UniqueName: \"kubernetes.io/projected/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-api-access-ml5d5\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.352413 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.352347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-crio-socket\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.359538 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.359513 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6nvzh"] Apr 24 16:40:01.359653 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.359616 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.363730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.363711 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:40:01.363851 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.363836 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zcktm\"" Apr 24 16:40:01.364132 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.364119 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:40:01.364189 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.364143 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:40:01.453579 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453457 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-config-volume\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.453579 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453579 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453542 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtjb9\" (UniqueName: 
\"kubernetes.io/projected/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-kube-api-access-wtjb9\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.453579 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5d5\" (UniqueName: \"kubernetes.io/projected/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-api-access-ml5d5\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453606 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-crio-socket\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-crio-socket\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453877 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453845 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-tmp-dir\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.453989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453897 2581 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-data-volume\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.453989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453950 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-metrics-tls\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.453989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.453981 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cc72beb-42b6-42d1-a909-660b999f008c-cert\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.454148 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.454004 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqwp6\" (UniqueName: \"kubernetes.io/projected/7cc72beb-42b6-42d1-a909-660b999f008c-kube-api-access-kqwp6\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.454148 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:40:01.454070 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-tmp-dir\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.454148 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.454137 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-config-volume\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.454246 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.454227 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-data-volume\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.454364 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.454347 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.458058 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.458033 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-metrics-tls\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.461865 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.461839 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.468530 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.468495 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtjb9\" (UniqueName: \"kubernetes.io/projected/bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769-kube-api-access-wtjb9\") pod \"dns-default-p2t5q\" (UID: \"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769\") " pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.470891 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.470868 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml5d5\" (UniqueName: \"kubernetes.io/projected/7bbec4d6-1e27-45e3-ae7f-fe2976a2de07-kube-api-access-ml5d5\") pod \"insights-runtime-extractor-7smgx\" (UID: \"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07\") " pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.554994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.554957 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cc72beb-42b6-42d1-a909-660b999f008c-cert\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.554994 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.554993 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwp6\" (UniqueName: \"kubernetes.io/projected/7cc72beb-42b6-42d1-a909-660b999f008c-kube-api-access-kqwp6\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 
16:40:01.557307 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.557286 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7cc72beb-42b6-42d1-a909-660b999f008c-cert\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.562779 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.562754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwp6\" (UniqueName: \"kubernetes.io/projected/7cc72beb-42b6-42d1-a909-660b999f008c-kube-api-access-kqwp6\") pod \"ingress-canary-6nvzh\" (UID: \"7cc72beb-42b6-42d1-a909-660b999f008c\") " pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.565580 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.565563 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7smgx" Apr 24 16:40:01.572250 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.572228 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:01.667963 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.667892 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6nvzh" Apr 24 16:40:01.722765 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.722715 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p2t5q"] Apr 24 16:40:01.725638 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.725609 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7smgx"] Apr 24 16:40:01.735933 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:01.735900 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8fd8e0_4aef_49f4_9fdb_98fb64fb7769.slice/crio-6a905c3b604f7cf4f9e550d9ec7d5c72c47a5fd7fbe08420c48c5aa6a51fa6de WatchSource:0}: Error finding container 6a905c3b604f7cf4f9e550d9ec7d5c72c47a5fd7fbe08420c48c5aa6a51fa6de: Status 404 returned error can't find the container with id 6a905c3b604f7cf4f9e550d9ec7d5c72c47a5fd7fbe08420c48c5aa6a51fa6de Apr 24 16:40:01.736269 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:01.736243 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbec4d6_1e27_45e3_ae7f_fe2976a2de07.slice/crio-34418e20a28c1ffc61b38d31fb929736984c9707adcc234c96b3c3cbe7b9bfef WatchSource:0}: Error finding container 34418e20a28c1ffc61b38d31fb929736984c9707adcc234c96b3c3cbe7b9bfef: Status 404 returned error can't find the container with id 34418e20a28c1ffc61b38d31fb929736984c9707adcc234c96b3c3cbe7b9bfef Apr 24 16:40:01.799310 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.799142 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6nvzh"] Apr 24 16:40:01.802930 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:01.802901 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc72beb_42b6_42d1_a909_660b999f008c.slice/crio-71a4bebe3e5ed580b22c74500ad278d3525643c0bb8bb5a0aa81d2d76550fd3a WatchSource:0}: Error finding container 71a4bebe3e5ed580b22c74500ad278d3525643c0bb8bb5a0aa81d2d76550fd3a: Status 404 returned error can't find the container with id 71a4bebe3e5ed580b22c74500ad278d3525643c0bb8bb5a0aa81d2d76550fd3a Apr 24 16:40:01.825946 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.825918 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"] Apr 24 16:40:01.836429 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.836410 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.837761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.837703 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"] Apr 24 16:40:01.838889 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.838865 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:40:01.839056 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.838910 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:40:01.839056 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839020 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lz9x2\"" Apr 24 16:40:01.839156 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839082 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:40:01.839156 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839088 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:40:01.839156 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839111 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:40:01.839408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839394 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:40:01.839895 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.839686 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:40:01.957482 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957447 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.957650 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957520 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.957650 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957550 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.957650 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957624 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-www7w\" (UniqueName: \"kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.957650 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957644 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:01.957794 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:01.957664 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058208 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca\") pod \"console-7c78b78c5c-vpc78\" (UID: 
\"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058290 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058282 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058497 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058312 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058497 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058377 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-www7w\" (UniqueName: \"kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.058497 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.058433 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.059114 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.059093 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.059211 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.059112 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.061996 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.061975 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.062053 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.061984 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.065459 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.065438 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-www7w\" (UniqueName: \"kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.068430 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.068406 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca\") pod \"console-7c78b78c5c-vpc78\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.119448 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.119407 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6nvzh" event={"ID":"7cc72beb-42b6-42d1-a909-660b999f008c","Type":"ContainerStarted","Data":"71a4bebe3e5ed580b22c74500ad278d3525643c0bb8bb5a0aa81d2d76550fd3a"} Apr 24 16:40:02.122232 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.122206 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" event={"ID":"a611eeef-0446-421a-b3c5-d38e773087f7","Type":"ContainerStarted","Data":"2ae8d337ea601643b672f5e0f4177429fead2f16cb930daa741911e4ca95be7d"} Apr 24 16:40:02.123059 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.123037 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2t5q" event={"ID":"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769","Type":"ContainerStarted","Data":"6a905c3b604f7cf4f9e550d9ec7d5c72c47a5fd7fbe08420c48c5aa6a51fa6de"} Apr 24 16:40:02.124133 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.124114 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7smgx" event={"ID":"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07","Type":"ContainerStarted","Data":"2643322476a70163b6fda908cc5252f168f0eb34cffe041e71c0ac0db8cc531d"} Apr 24 16:40:02.124209 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.124135 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7smgx" event={"ID":"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07","Type":"ContainerStarted","Data":"34418e20a28c1ffc61b38d31fb929736984c9707adcc234c96b3c3cbe7b9bfef"} Apr 24 16:40:02.145326 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:40:02.145280 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7xj6k" podStartSLOduration=4.238548806 podStartE2EDuration="37.145267514s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:39:26.135176689 +0000 UTC m=+1.820900910" lastFinishedPulling="2026-04-24 16:39:59.041895394 +0000 UTC m=+34.727619618" observedRunningTime="2026-04-24 16:40:02.144249959 +0000 UTC m=+37.829974202" watchObservedRunningTime="2026-04-24 16:40:02.145267514 +0000 UTC m=+37.830991816" Apr 24 16:40:02.146923 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.146898 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:02.275229 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.275183 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"] Apr 24 16:40:02.278474 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:02.278444 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf439dc58_a7ce_43d6_b84e_d91b40dec724.slice/crio-929e483bf3303fc0ed92fb8a0e3db1b86260e524e15e14bc681e723435a8bb26 WatchSource:0}: Error finding container 929e483bf3303fc0ed92fb8a0e3db1b86260e524e15e14bc681e723435a8bb26: Status 404 returned error can't find the container with id 929e483bf3303fc0ed92fb8a0e3db1b86260e524e15e14bc681e723435a8bb26 Apr 24 16:40:02.794303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.794274 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8"] Apr 24 16:40:02.817209 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.817180 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8"] Apr 24 16:40:02.817363 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.817317 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:02.820622 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.820032 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 16:40:02.820622 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.820187 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-tld8m\"" Apr 24 16:40:02.881092 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.881064 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:02.881092 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.881079 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:40:02.883676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.883651 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:40:02.883676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.883664 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:40:02.883917 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.883703 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\"" Apr 24 16:40:02.883917 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.883640 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:40:02.883917 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.883840 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\"" Apr 24 16:40:02.967425 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:02.967388 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6gp8\" (UID: \"b55d60c2-646a-407e-9882-06fac4ac6678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:03.069024 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:03.068935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-g6gp8\" (UID: \"b55d60c2-646a-407e-9882-06fac4ac6678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:03.069181 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:03.069163 2581 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 24 16:40:03.069306 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:03.069290 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates podName:b55d60c2-646a-407e-9882-06fac4ac6678 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:03.56921449 +0000 UTC m=+39.254938725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-g6gp8" (UID: "b55d60c2-646a-407e-9882-06fac4ac6678") : secret "prometheus-operator-admission-webhook-tls" not found Apr 24 16:40:03.128883 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:03.128822 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c78b78c5c-vpc78" event={"ID":"f439dc58-a7ce-43d6-b84e-d91b40dec724","Type":"ContainerStarted","Data":"929e483bf3303fc0ed92fb8a0e3db1b86260e524e15e14bc681e723435a8bb26"} Apr 24 16:40:03.573323 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:03.573281 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6gp8\" (UID: \"b55d60c2-646a-407e-9882-06fac4ac6678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:03.576580 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:03.576549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b55d60c2-646a-407e-9882-06fac4ac6678-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g6gp8\" (UID: \"b55d60c2-646a-407e-9882-06fac4ac6678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:03.741851 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:03.741808 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:05.752617 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:05.752560 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8"] Apr 24 16:40:05.763490 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:05.763464 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55d60c2_646a_407e_9882_06fac4ac6678.slice/crio-2aa7766ae0c489aff3e0a4cb4b53cb2483e9cafe24f613c183af425c46112bc3 WatchSource:0}: Error finding container 2aa7766ae0c489aff3e0a4cb4b53cb2483e9cafe24f613c183af425c46112bc3: Status 404 returned error can't find the container with id 2aa7766ae0c489aff3e0a4cb4b53cb2483e9cafe24f613c183af425c46112bc3 Apr 24 16:40:06.137231 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.137194 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c78b78c5c-vpc78" event={"ID":"f439dc58-a7ce-43d6-b84e-d91b40dec724","Type":"ContainerStarted","Data":"5925558218efb1e22363ced315b63f87a5e6928d6f128855fc7bbb8c409f2edb"} Apr 24 16:40:06.138924 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.138858 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2t5q" 
event={"ID":"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769","Type":"ContainerStarted","Data":"ad7beb9f12a8337b1871f602dfaed87281be5839ce03f4b86c831a92fddcc8b4"} Apr 24 16:40:06.138924 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.138886 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2t5q" event={"ID":"bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769","Type":"ContainerStarted","Data":"154096991bcd8ce59cbabb1d8acfcdc64323dd49c7124ed5f4424c93edcabfde"} Apr 24 16:40:06.139100 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.138990 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:06.140403 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.140372 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7smgx" event={"ID":"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07","Type":"ContainerStarted","Data":"e03eeea8494352ea76843739402fe1e4daa9cfa480e27e35d3d0f3cb5761c73f"} Apr 24 16:40:06.141410 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.141392 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" event={"ID":"b55d60c2-646a-407e-9882-06fac4ac6678","Type":"ContainerStarted","Data":"2aa7766ae0c489aff3e0a4cb4b53cb2483e9cafe24f613c183af425c46112bc3"} Apr 24 16:40:06.142622 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.142602 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6nvzh" event={"ID":"7cc72beb-42b6-42d1-a909-660b999f008c","Type":"ContainerStarted","Data":"16282e28f67a5164c57d22e903e4f4c77ac090938e6e7ebdb7ec52b4348bb968"} Apr 24 16:40:06.155966 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.155912 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c78b78c5c-vpc78" podStartSLOduration=1.546913096 
podStartE2EDuration="5.155898418s" podCreationTimestamp="2026-04-24 16:40:01 +0000 UTC" firstStartedPulling="2026-04-24 16:40:02.281076937 +0000 UTC m=+37.966801160" lastFinishedPulling="2026-04-24 16:40:05.890062244 +0000 UTC m=+41.575786482" observedRunningTime="2026-04-24 16:40:06.155096217 +0000 UTC m=+41.840820463" watchObservedRunningTime="2026-04-24 16:40:06.155898418 +0000 UTC m=+41.841622661" Apr 24 16:40:06.177239 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.177182 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p2t5q" podStartSLOduration=1.295271477 podStartE2EDuration="5.177166658s" podCreationTimestamp="2026-04-24 16:40:01 +0000 UTC" firstStartedPulling="2026-04-24 16:40:01.738270781 +0000 UTC m=+37.423995002" lastFinishedPulling="2026-04-24 16:40:05.620165961 +0000 UTC m=+41.305890183" observedRunningTime="2026-04-24 16:40:06.177000361 +0000 UTC m=+41.862724631" watchObservedRunningTime="2026-04-24 16:40:06.177166658 +0000 UTC m=+41.862890902" Apr 24 16:40:06.196699 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:06.196551 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6nvzh" podStartSLOduration=1.381372329 podStartE2EDuration="5.19653413s" podCreationTimestamp="2026-04-24 16:40:01 +0000 UTC" firstStartedPulling="2026-04-24 16:40:01.804998561 +0000 UTC m=+37.490722781" lastFinishedPulling="2026-04-24 16:40:05.620160355 +0000 UTC m=+41.305884582" observedRunningTime="2026-04-24 16:40:06.195936517 +0000 UTC m=+41.881660760" watchObservedRunningTime="2026-04-24 16:40:06.19653413 +0000 UTC m=+41.882258371" Apr 24 16:40:08.149721 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.149683 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7smgx" 
event={"ID":"7bbec4d6-1e27-45e3-ae7f-fe2976a2de07","Type":"ContainerStarted","Data":"135f3382148c40231ad2b1475510843e770cc81c47f4c58d9b1d142ba407aa8c"} Apr 24 16:40:08.150968 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.150945 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" event={"ID":"b55d60c2-646a-407e-9882-06fac4ac6678","Type":"ContainerStarted","Data":"aa1b36ad3e9b3d7cffff614206f6403b5a5227527133209b14394b6e44aed687"} Apr 24 16:40:08.151149 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.151133 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:08.155830 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.155806 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" Apr 24 16:40:08.168035 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.167991 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7smgx" podStartSLOduration=1.311891474 podStartE2EDuration="7.167978375s" podCreationTimestamp="2026-04-24 16:40:01 +0000 UTC" firstStartedPulling="2026-04-24 16:40:01.848349295 +0000 UTC m=+37.534073516" lastFinishedPulling="2026-04-24 16:40:07.704436197 +0000 UTC m=+43.390160417" observedRunningTime="2026-04-24 16:40:08.167299151 +0000 UTC m=+43.853023409" watchObservedRunningTime="2026-04-24 16:40:08.167978375 +0000 UTC m=+43.853702618" Apr 24 16:40:08.182331 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.182287 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g6gp8" podStartSLOduration=4.241828681 podStartE2EDuration="6.182275457s" podCreationTimestamp="2026-04-24 16:40:02 
+0000 UTC" firstStartedPulling="2026-04-24 16:40:05.765766797 +0000 UTC m=+41.451491017" lastFinishedPulling="2026-04-24 16:40:07.706213558 +0000 UTC m=+43.391937793" observedRunningTime="2026-04-24 16:40:08.1816753 +0000 UTC m=+43.867399545" watchObservedRunningTime="2026-04-24 16:40:08.182275457 +0000 UTC m=+43.867999699" Apr 24 16:40:08.613633 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.613590 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"] Apr 24 16:40:08.650327 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.650289 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"] Apr 24 16:40:08.650492 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.650375 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.657775 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.657754 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:40:08.813716 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813679 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813884 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813748 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813884 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813772 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813884 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813834 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813982 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813912 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813982 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813952 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.813982 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.813976 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9292v\" (UniqueName: 
\"kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.869667 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.869576 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"] Apr 24 16:40:08.897468 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.897441 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc" Apr 24 16:40:08.900185 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.899980 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:40:08.900185 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.900024 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:40:08.900185 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.900038 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:40:08.900185 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.900054 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 16:40:08.900185 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.900054 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-kjwg4\"" Apr 24 16:40:08.900561 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.900352 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 16:40:08.900826 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:40:08.900798 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"] Apr 24 16:40:08.914989 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.914964 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915094 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915016 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915094 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915041 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915094 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915069 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915245 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915103 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915245 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915131 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915245 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915179 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9292v\" (UniqueName: \"kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915767 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915747 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915864 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915805 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.915963 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.915947 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.916037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.916018 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.917401 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.917376 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.917657 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.917638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.923817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.923799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9292v\" (UniqueName: \"kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v\") pod \"console-55844dbbbc-8kmwb\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:08.958803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:08.958778 
2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55844dbbbc-8kmwb"
Apr 24 16:40:09.015805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.015768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5n8f\" (UniqueName: \"kubernetes.io/projected/cf96c251-c45d-4d4c-a5f7-928308adfb3a-kube-api-access-t5n8f\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.015957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.015851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.015957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.015905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf96c251-c45d-4d4c-a5f7-928308adfb3a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.016071 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.015993 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.078598 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.078565 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"]
Apr 24 16:40:09.081655 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:09.081629 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdea3f22_e17b_4388_bfaf_1282b94d2b83.slice/crio-868829818e56ce87aad369b391c297ddcf40ab000a0dc4eb958bbbf7b7a880c4 WatchSource:0}: Error finding container 868829818e56ce87aad369b391c297ddcf40ab000a0dc4eb958bbbf7b7a880c4: Status 404 returned error can't find the container with id 868829818e56ce87aad369b391c297ddcf40ab000a0dc4eb958bbbf7b7a880c4
Apr 24 16:40:09.116352 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.116311 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf96c251-c45d-4d4c-a5f7-928308adfb3a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.116523 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.116362 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.116523 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.116425 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5n8f\" (UniqueName: \"kubernetes.io/projected/cf96c251-c45d-4d4c-a5f7-928308adfb3a-kube-api-access-t5n8f\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.116523 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.116486 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.117028 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.117010 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cf96c251-c45d-4d4c-a5f7-928308adfb3a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.118922 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.118901 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.118990 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.118911 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cf96c251-c45d-4d4c-a5f7-928308adfb3a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.124833 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.124790 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5n8f\" (UniqueName: \"kubernetes.io/projected/cf96c251-c45d-4d4c-a5f7-928308adfb3a-kube-api-access-t5n8f\") pod \"prometheus-operator-5676c8c784-d8xhc\" (UID: \"cf96c251-c45d-4d4c-a5f7-928308adfb3a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.155253 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.155222 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55844dbbbc-8kmwb" event={"ID":"bdea3f22-e17b-4388-bfaf-1282b94d2b83","Type":"ContainerStarted","Data":"868829818e56ce87aad369b391c297ddcf40ab000a0dc4eb958bbbf7b7a880c4"}
Apr 24 16:40:09.206291 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.206257 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"
Apr 24 16:40:09.326954 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:09.326920 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-d8xhc"]
Apr 24 16:40:09.330172 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:09.330143 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf96c251_c45d_4d4c_a5f7_928308adfb3a.slice/crio-e91f82b6dd7780f8f0c23755a54c29808c99fcf974143692f6cc36a936fe0f88 WatchSource:0}: Error finding container e91f82b6dd7780f8f0c23755a54c29808c99fcf974143692f6cc36a936fe0f88: Status 404 returned error can't find the container with id e91f82b6dd7780f8f0c23755a54c29808c99fcf974143692f6cc36a936fe0f88
Apr 24 16:40:10.160064 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:10.160024 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc" event={"ID":"cf96c251-c45d-4d4c-a5f7-928308adfb3a","Type":"ContainerStarted","Data":"e91f82b6dd7780f8f0c23755a54c29808c99fcf974143692f6cc36a936fe0f88"}
Apr 24 16:40:10.161589 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:10.161554 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55844dbbbc-8kmwb" event={"ID":"bdea3f22-e17b-4388-bfaf-1282b94d2b83","Type":"ContainerStarted","Data":"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5"}
Apr 24 16:40:10.177817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:10.177761 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55844dbbbc-8kmwb" podStartSLOduration=2.177747017 podStartE2EDuration="2.177747017s" podCreationTimestamp="2026-04-24 16:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:40:10.176909438 +0000 UTC m=+45.862633680" watchObservedRunningTime="2026-04-24 16:40:10.177747017 +0000 UTC m=+45.863471260"
Apr 24 16:40:12.147548 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.147496 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c78b78c5c-vpc78"
Apr 24 16:40:12.147548 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.147553 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c78b78c5c-vpc78"
Apr 24 16:40:12.152394 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.152364 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c78b78c5c-vpc78"
Apr 24 16:40:12.167756 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.167726 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc" event={"ID":"cf96c251-c45d-4d4c-a5f7-928308adfb3a","Type":"ContainerStarted","Data":"bcc411032077d6c7c807f8bcee02144999898dedbeb2b5a89b1b77c2b6e777be"}
Apr 24 16:40:12.167901 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.167761 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc" event={"ID":"cf96c251-c45d-4d4c-a5f7-928308adfb3a","Type":"ContainerStarted","Data":"0a99363750334c1fd29589c54dc66c866e82be69cc2d0d7584d94e7b3d6b01ca"}
Apr 24 16:40:12.172115 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.172091 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c78b78c5c-vpc78"
Apr 24 16:40:12.186147 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:12.186102 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-d8xhc" podStartSLOduration=2.257922027 podStartE2EDuration="4.186090357s" podCreationTimestamp="2026-04-24 16:40:08 +0000 UTC" firstStartedPulling="2026-04-24 16:40:09.331871355 +0000 UTC m=+45.017595576" lastFinishedPulling="2026-04-24 16:40:11.260039682 +0000 UTC m=+46.945763906" observedRunningTime="2026-04-24 16:40:12.185351762 +0000 UTC m=+47.871076003" watchObservedRunningTime="2026-04-24 16:40:12.186090357 +0000 UTC m=+47.871814594"
Apr 24 16:40:14.210913 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.210878 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"]
Apr 24 16:40:14.234258 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.234224 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"]
Apr 24 16:40:14.234258 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.234262 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"]
Apr 24 16:40:14.234462 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.234270 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.236746 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.236723 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 24 16:40:14.236874 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.236724 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-gql2j\""
Apr 24 16:40:14.236874 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.236724 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 24 16:40:14.246793 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.246769 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"]
Apr 24 16:40:14.246908 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.246887 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.248932 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.248903 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 16:40:14.249099 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.249061 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 16:40:14.249205 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.249187 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-bxcsk\""
Apr 24 16:40:14.249280 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.249266 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 16:40:14.252232 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.252208 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9c8tw"]
Apr 24 16:40:14.274184 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.274155 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.276670 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.276646 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 16:40:14.276825 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.276787 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-n9mn8\""
Apr 24 16:40:14.276954 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.276936 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 16:40:14.277025 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.276956 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 16:40:14.359315 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359277 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.359498 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359333 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359498 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359357 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359498 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359381 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359498 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359454 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f43bd50f-a7ab-4412-a611-b779465b96fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359522 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.359663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359566 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9vk6\" (UniqueName: \"kubernetes.io/projected/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-api-access-b9vk6\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.359663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359627 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.359663 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.359655 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbzx\" (UniqueName: \"kubernetes.io/projected/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-kube-api-access-nkbzx\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461010 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.460916 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461010 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.460979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461198 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461198 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461054 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-root\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461198 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461079 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-metrics-client-ca\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461198 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461168 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbzx\" (UniqueName: \"kubernetes.io/projected/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-kube-api-access-nkbzx\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461198 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461194 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-sys\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461213 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-accelerators-collector-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461278 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461300 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc88t\" (UniqueName: \"kubernetes.io/projected/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-kube-api-access-cc88t\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461379 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.461383 2581 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 24 16:40:14.461408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461398 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461432 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f43bd50f-a7ab-4412-a611-b779465b96fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.461458 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls podName:f1d225ba-df3d-4ffa-88a0-edb91a39eb75 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:14.961435813 +0000 UTC m=+50.647160049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-lxnf9" (UID: "f1d225ba-df3d-4ffa-88a0-edb91a39eb75") : secret "openshift-state-metrics-tls" not found
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461537 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-textfile\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461576 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-wtmp\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461625 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461675 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9vk6\" (UniqueName: \"kubernetes.io/projected/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-api-access-b9vk6\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.461842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461815 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f43bd50f-a7ab-4412-a611-b779465b96fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.462251 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.461805 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.462328 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.462302 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f43bd50f-a7ab-4412-a611-b779465b96fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.464021 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.463991 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.464021 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.464015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.464173 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.464109 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.469933 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.469908 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbzx\" (UniqueName: \"kubernetes.io/projected/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-kube-api-access-nkbzx\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"
Apr 24 16:40:14.470109 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.470092 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9vk6\" (UniqueName: \"kubernetes.io/projected/f43bd50f-a7ab-4412-a611-b779465b96fa-kube-api-access-b9vk6\") pod \"kube-state-metrics-69db897b98-4kcgp\" (UID: \"f43bd50f-a7ab-4412-a611-b779465b96fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.555786 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.555754 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"
Apr 24 16:40:14.562803 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.562768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.562931 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.562820 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-root\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.562931 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.562883 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-root\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.562931 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.562910 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 16:40:14.563105 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.562972 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls podName:8e4128aa-6993-4ee6-a68e-8f79e6b7bece nodeName:}" failed. No retries permitted until 2026-04-24 16:40:15.062952138 +0000 UTC m=+50.748676376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls") pod "node-exporter-9c8tw" (UID: "8e4128aa-6993-4ee6-a68e-8f79e6b7bece") : secret "node-exporter-tls" not found
Apr 24 16:40:14.563105 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.562992 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-metrics-client-ca\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563105 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-sys\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563105 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563053 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-accelerators-collector-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563318 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563318 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563128 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-sys\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563746 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563136 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc88t\" (UniqueName: \"kubernetes.io/projected/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-kube-api-access-cc88t\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.563959 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.563599 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-metrics-client-ca\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.564084 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.564027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-accelerators-collector-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.564298 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.564032 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-textfile\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.564412 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.564351 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-wtmp\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.564546 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.564527 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-wtmp\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.565709 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.565678 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.576865 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.576835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-textfile\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw"
Apr 24 16:40:14.578908 ip-10-0-143-104
kubenswrapper[2581]: I0424 16:40:14.578876 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc88t\" (UniqueName: \"kubernetes.io/projected/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-kube-api-access-cc88t\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw" Apr 24 16:40:14.681702 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.681667 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4kcgp"] Apr 24 16:40:14.689925 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:14.689892 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43bd50f_a7ab_4412_a611_b779465b96fa.slice/crio-8b1d7eb83627426f20589f216881a44e10ed97a788bf3165f59ed7b261a8c304 WatchSource:0}: Error finding container 8b1d7eb83627426f20589f216881a44e10ed97a788bf3165f59ed7b261a8c304: Status 404 returned error can't find the container with id 8b1d7eb83627426f20589f216881a44e10ed97a788bf3165f59ed7b261a8c304 Apr 24 16:40:14.967351 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:14.967315 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" Apr 24 16:40:14.967542 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.967437 2581 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 24 16:40:14.967542 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:14.967490 2581 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls podName:f1d225ba-df3d-4ffa-88a0-edb91a39eb75 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:15.967476589 +0000 UTC m=+51.653200810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-lxnf9" (UID: "f1d225ba-df3d-4ffa-88a0-edb91a39eb75") : secret "openshift-state-metrics-tls" not found Apr 24 16:40:15.068296 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.068252 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw" Apr 24 16:40:15.070668 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.070642 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e4128aa-6993-4ee6-a68e-8f79e6b7bece-node-exporter-tls\") pod \"node-exporter-9c8tw\" (UID: \"8e4128aa-6993-4ee6-a68e-8f79e6b7bece\") " pod="openshift-monitoring/node-exporter-9c8tw" Apr 24 16:40:15.177146 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.177096 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp" event={"ID":"f43bd50f-a7ab-4412-a611-b779465b96fa","Type":"ContainerStarted","Data":"8b1d7eb83627426f20589f216881a44e10ed97a788bf3165f59ed7b261a8c304"} Apr 24 16:40:15.183342 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.183306 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9c8tw" Apr 24 16:40:15.191601 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:15.191572 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4128aa_6993_4ee6_a68e_8f79e6b7bece.slice/crio-1db773188c7ada0e429e13b8c9782ff1a7c4b899f8b6fd31e362e513c8e98580 WatchSource:0}: Error finding container 1db773188c7ada0e429e13b8c9782ff1a7c4b899f8b6fd31e362e513c8e98580: Status 404 returned error can't find the container with id 1db773188c7ada0e429e13b8c9782ff1a7c4b899f8b6fd31e362e513c8e98580 Apr 24 16:40:15.306481 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.306401 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:40:15.325438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.325412 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.330528 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.330116 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 16:40:15.330528 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.330338 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 16:40:15.331391 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.331229 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 16:40:15.332007 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.331980 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 16:40:15.332112 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.332045 2581 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 16:40:15.332259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.332241 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 16:40:15.332317 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.332296 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 16:40:15.335236 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.335216 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 16:40:15.336158 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.336128 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-68rp2\"" Apr 24 16:40:15.339558 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.339539 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 16:40:15.340538 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.340518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:40:15.370856 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.370816 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.370856 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.370855 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqd8s\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-kube-api-access-sqd8s\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371052 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.370938 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371052 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371008 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-out\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371052 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371037 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371206 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371110 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371206 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371146 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371206 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371324 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371272 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371324 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371302 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371424 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371326 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371424 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371358 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.371424 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.371411 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-web-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.471976 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.471945 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.471976 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.471984 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqd8s\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-kube-api-access-sqd8s\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472322 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:40:15.472282 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472342 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-out\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472376 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472423 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472448 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472494 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-volume\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472567 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472621 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472856 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472655 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.472856 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.472685 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-web-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.473208 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.473174 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.474269 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.473854 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.474269 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:15.473970 2581 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 24 16:40:15.474269 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:15.474039 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls podName:90bb1fd8-5247-4ab8-b24f-09042b28a1d1 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:40:15.974020896 +0000 UTC m=+51.659745136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "90bb1fd8-5247-4ab8-b24f-09042b28a1d1") : secret "alertmanager-main-tls" not found Apr 24 16:40:15.474269 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.474155 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.475141 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.475095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.475628 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.475601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-out\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.475747 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.475687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.477088 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.477035 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-web-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.477088 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.477048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.477435 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.477398 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.478331 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.478307 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-tls-assets\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.478691 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.478667 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.484408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.484385 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqd8s\" (UniqueName: \"kubernetes.io/projected/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-kube-api-access-sqd8s\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.977755 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.977703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.977755 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.977756 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" Apr 24 16:40:15.980446 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.980416 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/90bb1fd8-5247-4ab8-b24f-09042b28a1d1-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"90bb1fd8-5247-4ab8-b24f-09042b28a1d1\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:15.980595 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:15.980529 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/f1d225ba-df3d-4ffa-88a0-edb91a39eb75-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lxnf9\" (UID: \"f1d225ba-df3d-4ffa-88a0-edb91a39eb75\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" Apr 24 16:40:16.044354 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.044316 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" Apr 24 16:40:16.148421 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.148388 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p2t5q" Apr 24 16:40:16.183071 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.183020 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9c8tw" event={"ID":"8e4128aa-6993-4ee6-a68e-8f79e6b7bece","Type":"ContainerStarted","Data":"1db773188c7ada0e429e13b8c9782ff1a7c4b899f8b6fd31e362e513c8e98580"} Apr 24 16:40:16.238726 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.238658 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:40:16.309234 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.309184 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-68b58d564c-vmd2m"] Apr 24 16:40:16.313935 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.313913 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.316427 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.316404 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 16:40:16.316631 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.316606 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 16:40:16.316726 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.316666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 16:40:16.316784 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.316747 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tfhxj\"" Apr 24 16:40:16.316957 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.316935 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 16:40:16.317271 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.317244 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 16:40:16.317632 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.317456 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5cmpa1p47fc62\"" Apr 24 16:40:16.326705 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.325333 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-68b58d564c-vmd2m"] Apr 24 16:40:16.381855 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.381639 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/425a7976-5d3c-47ca-9644-a15892195c45-metrics-client-ca\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382022 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-grpc-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382073 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382115 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382148 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq25s\" (UniqueName: 
\"kubernetes.io/projected/425a7976-5d3c-47ca-9644-a15892195c45-kube-api-access-fq25s\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382245 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382576 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382340 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.382576 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.382408 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.434395 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.434345 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9"] Apr 24 16:40:16.440724 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:16.440495 2581 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d225ba_df3d_4ffa_88a0_edb91a39eb75.slice/crio-32d107e86ec826056f09191c518363c19e6bd77b3ac6df2b4ef5cb75a0e20fa6 WatchSource:0}: Error finding container 32d107e86ec826056f09191c518363c19e6bd77b3ac6df2b4ef5cb75a0e20fa6: Status 404 returned error can't find the container with id 32d107e86ec826056f09191c518363c19e6bd77b3ac6df2b4ef5cb75a0e20fa6 Apr 24 16:40:16.466893 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.466839 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:40:16.473562 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:16.473308 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90bb1fd8_5247_4ab8_b24f_09042b28a1d1.slice/crio-5bad4031c4f784f00faed32fcb506962d993394153f4b21d68a07b3f3df3bf69 WatchSource:0}: Error finding container 5bad4031c4f784f00faed32fcb506962d993394153f4b21d68a07b3f3df3bf69: Status 404 returned error can't find the container with id 5bad4031c4f784f00faed32fcb506962d993394153f4b21d68a07b3f3df3bf69 Apr 24 16:40:16.484000 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.483956 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.484237 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.484198 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/425a7976-5d3c-47ca-9644-a15892195c45-metrics-client-ca\") pod 
\"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.484681 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.484625 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-grpc-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.484923 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.484860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.485244 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.485209 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.485482 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.485439 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq25s\" (UniqueName: \"kubernetes.io/projected/425a7976-5d3c-47ca-9644-a15892195c45-kube-api-access-fq25s\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.485953 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:40:16.485752 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.485953 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.485841 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.487051 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.486789 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/425a7976-5d3c-47ca-9644-a15892195c45-metrics-client-ca\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.490126 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.490051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.490249 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.490227 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.492732 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.492702 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.494286 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.493132 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.494286 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.494229 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-grpc-tls\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.494791 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.494651 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/425a7976-5d3c-47ca-9644-a15892195c45-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: 
\"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.502632 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.502578 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq25s\" (UniqueName: \"kubernetes.io/projected/425a7976-5d3c-47ca-9644-a15892195c45-kube-api-access-fq25s\") pod \"thanos-querier-68b58d564c-vmd2m\" (UID: \"425a7976-5d3c-47ca-9644-a15892195c45\") " pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.632176 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.632143 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:16.772906 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:16.772823 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-68b58d564c-vmd2m"] Apr 24 16:40:16.776018 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:16.775988 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod425a7976_5d3c_47ca_9644_a15892195c45.slice/crio-a46b320d446fc10e62511ba14bc26b5b13085bda63dfeb337b2bdf656a2b17e9 WatchSource:0}: Error finding container a46b320d446fc10e62511ba14bc26b5b13085bda63dfeb337b2bdf656a2b17e9: Status 404 returned error can't find the container with id a46b320d446fc10e62511ba14bc26b5b13085bda63dfeb337b2bdf656a2b17e9 Apr 24 16:40:17.188991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.188846 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp" event={"ID":"f43bd50f-a7ab-4412-a611-b779465b96fa","Type":"ContainerStarted","Data":"2e29881bd8cf790b9101f969a1126b1811294d1742256d77ff615379860f3369"} Apr 24 16:40:17.188991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.188894 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp" event={"ID":"f43bd50f-a7ab-4412-a611-b779465b96fa","Type":"ContainerStarted","Data":"c1b49f8c7ffff13b17e5b25e3395279bd709295910044d85c48a5ec82b0da9a5"} Apr 24 16:40:17.188991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.188909 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp" event={"ID":"f43bd50f-a7ab-4412-a611-b779465b96fa","Type":"ContainerStarted","Data":"43605cdc28888fad33d7dda3263c8426158ffbab449be311c4e9c4a15fbbf50f"} Apr 24 16:40:17.191161 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.191024 2581 generic.go:358] "Generic (PLEG): container finished" podID="8e4128aa-6993-4ee6-a68e-8f79e6b7bece" containerID="875c6f22b6b1a32a91cbca66f47b56d4192c08cbfdeffdfd2e3bf3088ee369c8" exitCode=0 Apr 24 16:40:17.191161 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.191112 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9c8tw" event={"ID":"8e4128aa-6993-4ee6-a68e-8f79e6b7bece","Type":"ContainerDied","Data":"875c6f22b6b1a32a91cbca66f47b56d4192c08cbfdeffdfd2e3bf3088ee369c8"} Apr 24 16:40:17.192748 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.192720 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"a46b320d446fc10e62511ba14bc26b5b13085bda63dfeb337b2bdf656a2b17e9"} Apr 24 16:40:17.194626 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.194589 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"5bad4031c4f784f00faed32fcb506962d993394153f4b21d68a07b3f3df3bf69"} Apr 24 16:40:17.197115 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.197083 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" event={"ID":"f1d225ba-df3d-4ffa-88a0-edb91a39eb75","Type":"ContainerStarted","Data":"d7b9285c367f2f14a37da2ebd04777324558afc76ec5eb7c65cb1dbad0c56cf0"} Apr 24 16:40:17.197115 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.197114 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" event={"ID":"f1d225ba-df3d-4ffa-88a0-edb91a39eb75","Type":"ContainerStarted","Data":"cefdf08592aa3d8683aacdafc29c0afe9325ebf15db194c00b241c317ef86a69"} Apr 24 16:40:17.197271 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.197129 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" event={"ID":"f1d225ba-df3d-4ffa-88a0-edb91a39eb75","Type":"ContainerStarted","Data":"32d107e86ec826056f09191c518363c19e6bd77b3ac6df2b4ef5cb75a0e20fa6"} Apr 24 16:40:17.211863 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:17.211753 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-4kcgp" podStartSLOduration=1.60776378 podStartE2EDuration="3.211734054s" podCreationTimestamp="2026-04-24 16:40:14 +0000 UTC" firstStartedPulling="2026-04-24 16:40:14.692053808 +0000 UTC m=+50.377778028" lastFinishedPulling="2026-04-24 16:40:16.29602407 +0000 UTC m=+51.981748302" observedRunningTime="2026-04-24 16:40:17.210911018 +0000 UTC m=+52.896635298" watchObservedRunningTime="2026-04-24 16:40:17.211734054 +0000 UTC m=+52.897458297" Apr 24 16:40:18.201772 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.201674 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9c8tw" event={"ID":"8e4128aa-6993-4ee6-a68e-8f79e6b7bece","Type":"ContainerStarted","Data":"3dc5e6841c893bd7dad904fcaeb110309b21db48a3eaec908fdb2e4c0cbe95d0"} Apr 24 16:40:18.201772 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:40:18.201714 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9c8tw" event={"ID":"8e4128aa-6993-4ee6-a68e-8f79e6b7bece","Type":"ContainerStarted","Data":"f8cbe32b31fbd5a2d10fd55cecaf54edd23aa8a7ed1539a353fd20b2b13fcec4"} Apr 24 16:40:18.203170 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.203140 2581 generic.go:358] "Generic (PLEG): container finished" podID="90bb1fd8-5247-4ab8-b24f-09042b28a1d1" containerID="b6c4d6f021e7544ba279c3609d4c71acc54b837b267089b671de5f4bc4408846" exitCode=0 Apr 24 16:40:18.203325 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.203226 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerDied","Data":"b6c4d6f021e7544ba279c3609d4c71acc54b837b267089b671de5f4bc4408846"} Apr 24 16:40:18.205231 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.205193 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" event={"ID":"f1d225ba-df3d-4ffa-88a0-edb91a39eb75","Type":"ContainerStarted","Data":"cffd5e44767afb62b7cd2dea9c022c2515ecc0b365dec55fd4b59dfd2d6d406f"} Apr 24 16:40:18.221959 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.221914 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9c8tw" podStartSLOduration=3.119053065 podStartE2EDuration="4.221901555s" podCreationTimestamp="2026-04-24 16:40:14 +0000 UTC" firstStartedPulling="2026-04-24 16:40:15.193173795 +0000 UTC m=+50.878898016" lastFinishedPulling="2026-04-24 16:40:16.296022273 +0000 UTC m=+51.981746506" observedRunningTime="2026-04-24 16:40:18.220205538 +0000 UTC m=+53.905929780" watchObservedRunningTime="2026-04-24 16:40:18.221901555 +0000 UTC m=+53.907625798" Apr 24 16:40:18.238656 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.238609 2581 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lxnf9" podStartSLOduration=2.944826534 podStartE2EDuration="4.238594563s" podCreationTimestamp="2026-04-24 16:40:14 +0000 UTC" firstStartedPulling="2026-04-24 16:40:16.659195482 +0000 UTC m=+52.344919706" lastFinishedPulling="2026-04-24 16:40:17.952963499 +0000 UTC m=+53.638687735" observedRunningTime="2026-04-24 16:40:18.237526225 +0000 UTC m=+53.923250470" watchObservedRunningTime="2026-04-24 16:40:18.238594563 +0000 UTC m=+53.924318806" Apr 24 16:40:18.534039 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.533999 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-67859b89cc-28g4s"] Apr 24 16:40:18.538648 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.538621 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.540833 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.540805 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 24 16:40:18.541064 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.541038 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 16:40:18.541064 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.541069 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 24 16:40:18.541236 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.541208 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-75b6ik0qdpboj\"" Apr 24 16:40:18.541394 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.541363 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 24 16:40:18.541484 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.541397 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fvx7x\"" Apr 24 16:40:18.547884 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.547857 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67859b89cc-28g4s"] Apr 24 16:40:18.607450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607411 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrk6\" (UniqueName: \"kubernetes.io/projected/0e9cacec-d9bc-49d8-809b-54a40842359c-kube-api-access-bnrk6\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607630 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607468 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607630 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607533 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-client-certs\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607630 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:40:18.607583 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-client-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607630 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607617 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0e9cacec-d9bc-49d8-809b-54a40842359c-audit-log\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607734 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-tls\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.607842 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.607787 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-metrics-server-audit-profiles\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.708965 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.708931 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-tls\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709175 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.708984 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-metrics-server-audit-profiles\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709175 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709033 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrk6\" (UniqueName: \"kubernetes.io/projected/0e9cacec-d9bc-49d8-809b-54a40842359c-kube-api-access-bnrk6\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709175 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709175 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709111 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-client-certs\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") 
" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709175 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709175 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-client-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709438 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709195 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0e9cacec-d9bc-49d8-809b-54a40842359c-audit-log\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.709665 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.709635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/0e9cacec-d9bc-49d8-809b-54a40842359c-audit-log\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.710040 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.710015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:18.710364 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.710315 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/0e9cacec-d9bc-49d8-809b-54a40842359c-metrics-server-audit-profiles\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.712212 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.712188 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-client-certs\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.712331 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.712311 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-secret-metrics-server-tls\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.712331 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.712325 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e9cacec-d9bc-49d8-809b-54a40842359c-client-ca-bundle\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.718262 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.718239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrk6\" (UniqueName: \"kubernetes.io/projected/0e9cacec-d9bc-49d8-809b-54a40842359c-kube-api-access-bnrk6\") pod \"metrics-server-67859b89cc-28g4s\" (UID: \"0e9cacec-d9bc-49d8-809b-54a40842359c\") " pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.850437 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.850410 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:18.959323 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.958935 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55844dbbbc-8kmwb"
Apr 24 16:40:18.959564 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.959490 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55844dbbbc-8kmwb"
Apr 24 16:40:18.964832 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:18.964728 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55844dbbbc-8kmwb"
Apr 24 16:40:19.008154 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.008122 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-67859b89cc-28g4s"]
Apr 24 16:40:19.090206 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.090102 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"]
Apr 24 16:40:19.211366 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.211312 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" event={"ID":"0e9cacec-d9bc-49d8-809b-54a40842359c","Type":"ContainerStarted","Data":"ffe4ab61cf9a9f078b3b943b62a26ddbb12fd41f48859e5182033d332bd5fab7"}
Apr 24 16:40:19.214178 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.214146 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"032769ab7b8a73c2db9de1ed34d1ee452fcaf10ef3d3a6de695a65ad9a406344"}
Apr 24 16:40:19.214328 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.214184 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"dfb174247f787aadd48076eec83c66e41a9507d02f24d13c32d670bbd445be03"}
Apr 24 16:40:19.214328 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.214199 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"e85c66c2d6ff9a51c4ab1102869c4f801307e7447371fd89f97dea3538a67838"}
Apr 24 16:40:19.220759 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:19.220737 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55844dbbbc-8kmwb"
Apr 24 16:40:20.220090 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.220055 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"1b74e0a3db974025f3b7fea90d6029d0ba25bdbc47f62b993f9df07289f6c186"}
Apr 24 16:40:20.222080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.222050 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"df78bbdbc908ecfab8ddcf8eab81429950c0402e61ab70d722746bb1b763eef8"}
Apr 24 16:40:20.478367 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.478327 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 16:40:20.484429 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.484396 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.486805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.486667 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-slbhj\""
Apr 24 16:40:20.486805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.486722 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 16:40:20.486805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.486671 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 16:40:20.486805 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.486771 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 16:40:20.487454 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487434 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 16:40:20.487588 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487461 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 16:40:20.487588 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487518 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 16:40:20.487588 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487542 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 16:40:20.487825 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487810 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 16:40:20.487902 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487855 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 16:40:20.487902 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487867 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a7kgoo2vhv932\""
Apr 24 16:40:20.487902 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.487860 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 16:40:20.491691 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.491662 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 16:40:20.497206 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.497157 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 16:40:20.500116 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.500086 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 16:40:20.528555 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528453 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528555 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528519 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528555 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528552 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528641 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528748 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528785 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528808 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.528847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528846 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528872 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528902 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528946 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.528971 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.529078 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.529154 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.529265 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.529183 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnps6\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-kube-api-access-lnps6\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629616 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629577 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629731 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629731 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629668 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629731 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629716 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629890 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629742 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629890 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629781 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629890 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629837 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.629890 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629866 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629927 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629966 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.629993 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630080 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630071 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630494 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630096 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630494 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630494 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630166 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.630494 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630192 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnps6\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-kube-api-access-lnps6\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.631012 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.630915 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.632702 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.631922 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.632702 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.632095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.633588 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.633409 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.634519 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.634233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.635630 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.635600 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5274be0-cad9-4355-bcd9-8ca6089639d0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.637013 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.636969 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.638811 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.638777 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.639308 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.639284 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.639437 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.639414 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.639618 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.639523 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.639837 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.639817 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.640202 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.640184 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.640592 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.640280 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-config\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.640680 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.640615 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.640871 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.640846 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5274be0-cad9-4355-bcd9-8ca6089639d0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.642308 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.642288 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnps6\" (UniqueName: \"kubernetes.io/projected/f5274be0-cad9-4355-bcd9-8ca6089639d0-kube-api-access-lnps6\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.642880 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.642854 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5274be0-cad9-4355-bcd9-8ca6089639d0-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5274be0-cad9-4355-bcd9-8ca6089639d0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.799951 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.799869 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:40:20.960763 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:20.960726 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 16:40:20.967704 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:20.967678 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5274be0_cad9_4355_bcd9_8ca6089639d0.slice/crio-9b545153c36ebf9dbf75c686389300d4a9334ee106007c0bf6f38e0f571ac163 WatchSource:0}: Error finding container 9b545153c36ebf9dbf75c686389300d4a9334ee106007c0bf6f38e0f571ac163: Status 404 returned error can't find the container with id 9b545153c36ebf9dbf75c686389300d4a9334ee106007c0bf6f38e0f571ac163
Apr 24 16:40:21.226720 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.226680 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" event={"ID":"0e9cacec-d9bc-49d8-809b-54a40842359c","Type":"ContainerStarted","Data":"d80e91298c5639a12ed61a2442a67b296e3b974e514ef4d203dd1cf82515acb3"}
Apr 24 16:40:21.229283 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.229257 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"3fc055b5463510ca30dc7dbee386c4a0dd84660a26f8e610ec820b55286e62d9"}
Apr 24 16:40:21.229414 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.229288 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" event={"ID":"425a7976-5d3c-47ca-9644-a15892195c45","Type":"ContainerStarted","Data":"d0f4598584f5d0ecb563313bfe55344975699923c2f33bb92a526ca7a2183260"}
Apr 24 16:40:21.229477 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.229439 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m"
Apr 24 16:40:21.234676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.234612 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"0001725984c236cecd70384a6ebec217d58f358fdbe509ae1b8c5d96ec4694d6"}
Apr 24 16:40:21.234676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.234642 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"e53d7136a371014a97afe7ab581f361529a67b489ef87c8f74e751bae95ffa5e"}
Apr 24 16:40:21.234676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.234655 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"5c7acf87b14cb86c689e1adef9c9c63eb445b5d8318068f2273438d54b458597"}
Apr 24 16:40:21.234676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.234663 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"ad047cc7354694063feea2b39577e2b98d578bbb6b78e56c40ac44d197528d1b"}
Apr 24 16:40:21.234676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.234673 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"90bb1fd8-5247-4ab8-b24f-09042b28a1d1","Type":"ContainerStarted","Data":"d0e076440a9a9d61335775c496890d4b92ccee31a8dcdbf4dfa004d0206ff0fb"}
Apr 24 16:40:21.236414 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.236390 2581 generic.go:358] "Generic (PLEG): container finished" podID="f5274be0-cad9-4355-bcd9-8ca6089639d0" containerID="85844a3d07441caca511087746322bf708b0a394bbcbcf02263c64ed0b108184" exitCode=0
Apr 24 16:40:21.236551 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.236424 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerDied","Data":"85844a3d07441caca511087746322bf708b0a394bbcbcf02263c64ed0b108184"}
Apr 24 16:40:21.236551 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.236444 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"9b545153c36ebf9dbf75c686389300d4a9334ee106007c0bf6f38e0f571ac163"}
Apr 24 16:40:21.244807 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.244762 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" podStartSLOduration=1.639258228 podStartE2EDuration="3.244746442s" podCreationTimestamp="2026-04-24 16:40:18 +0000 UTC" firstStartedPulling="2026-04-24 16:40:19.015407363 +0000 UTC m=+54.701131592" lastFinishedPulling="2026-04-24 16:40:20.620895543 +0000 UTC m=+56.306619806" observedRunningTime="2026-04-24 16:40:21.243312203 +0000 UTC m=+56.929036446" watchObservedRunningTime="2026-04-24 16:40:21.244746442 +0000 UTC m=+56.930470681"
Apr 24 16:40:21.293022 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.292965 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.725290273 podStartE2EDuration="6.292949805s" podCreationTimestamp="2026-04-24 16:40:15 +0000 UTC" firstStartedPulling="2026-04-24 16:40:16.475084386 +0000 UTC m=+52.160808615" lastFinishedPulling="2026-04-24 16:40:20.042743909 +0000 UTC m=+55.728468147" observedRunningTime="2026-04-24 16:40:21.290937633 +0000 UTC m=+56.976661876" watchObservedRunningTime="2026-04-24 16:40:21.292949805 +0000 UTC m=+56.978674053"
Apr 24 16:40:21.310317 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:21.310254 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" podStartSLOduration=2.044776073 podStartE2EDuration="5.310232851s" podCreationTimestamp="2026-04-24 16:40:16 +0000 UTC" firstStartedPulling="2026-04-24 16:40:16.777896068 +0000 UTC m=+52.463620293" lastFinishedPulling="2026-04-24 16:40:20.043352847 +0000 UTC m=+55.729077071" observedRunningTime="2026-04-24 16:40:21.308997528 +0000 UTC m=+56.994721781" watchObservedRunningTime="2026-04-24 16:40:21.310232851 +0000 UTC m=+56.995957163"
Apr 24 16:40:23.109718 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:23.109689 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k59gs"
Apr 24 16:40:24.250181 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:24.250146 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"30eceacb58cec7e73ddb9cb18fad32add156e33f57fecd734273a10b3c972bf9"}
Apr 24 16:40:24.250465 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:24.250185 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"941b74e6e35bbacde950d3640e55686e967369d4dbb5915683d5dfa634ce3de8"}
Apr 24 16:40:25.257937 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.257898 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"91ed93d31573550acb33e25b4a63894cc4be01b192c45ad8762a8e087a3a10b8"}
Apr 24 16:40:25.257937 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.257935 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"f2a5e3bc2b24eb65d9cbb41f12512c7a83826ed896bb51d64787862f1eb3f7b0"}
Apr 24 16:40:25.257937 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.257944 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"d0059f352f30ef8ad30b312f53cccfddddf116ea93d4b4a6ebb1060f5f48282f"}
Apr 24 16:40:25.258530 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.257953 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5274be0-cad9-4355-bcd9-8ca6089639d0","Type":"ContainerStarted","Data":"ede88a881ae7e114f9c05e59e578283df1b8f9c420e9560b6a3782d062493d65"}
Apr 24 16:40:25.288154 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.288083 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.417831125 podStartE2EDuration="5.288062082s" podCreationTimestamp="2026-04-24 16:40:20 +0000 UTC" firstStartedPulling="2026-04-24 16:40:21.237655007 +0000 UTC m=+56.923379228" lastFinishedPulling="2026-04-24 16:40:24.107885948 +0000 UTC m=+59.793610185" observedRunningTime="2026-04-24 16:40:25.284939193 +0000 UTC m=+60.970663429" watchObservedRunningTime="2026-04-24 16:40:25.288062082 +0000 UTC m=+60.973786329"
Apr 24 16:40:25.800789 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:25.800754 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:40:27.248802 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:27.248774 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-68b58d564c-vmd2m" Apr 24 16:40:29.614643 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.614602 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:40:29.616900 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.616878 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:40:29.627932 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.627905 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d85b39e7-4145-4783-a50d-e94999b43e90-metrics-certs\") pod \"network-metrics-daemon-q5b2h\" (UID: \"d85b39e7-4145-4783-a50d-e94999b43e90\") " pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:40:29.715818 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.715775 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:29.718380 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.718359 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:40:29.728131 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.728107 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:40:29.739706 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.739687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7gbg\" (UniqueName: \"kubernetes.io/projected/4fda4ceb-5ea7-4202-903b-a9a5b5152485-kube-api-access-l7gbg\") pod \"network-check-target-9wjxs\" (UID: \"4fda4ceb-5ea7-4202-903b-a9a5b5152485\") " pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:29.899309 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.899239 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\"" Apr 24 16:40:29.900741 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.900721 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\"" Apr 24 16:40:29.907428 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.907406 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:29.909178 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:29.909154 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-q5b2h" Apr 24 16:40:30.044651 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:30.044577 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q5b2h"] Apr 24 16:40:30.048255 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:30.048227 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85b39e7_4145_4783_a50d_e94999b43e90.slice/crio-39db4746407bf7f56dafd93795d8c6e4668ba208272b35bb2f6fad3963eaa253 WatchSource:0}: Error finding container 39db4746407bf7f56dafd93795d8c6e4668ba208272b35bb2f6fad3963eaa253: Status 404 returned error can't find the container with id 39db4746407bf7f56dafd93795d8c6e4668ba208272b35bb2f6fad3963eaa253 Apr 24 16:40:30.063053 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:30.063023 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9wjxs"] Apr 24 16:40:30.067202 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:40:30.067179 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fda4ceb_5ea7_4202_903b_a9a5b5152485.slice/crio-9d5ac83a750d2c4bfcd260745b32252984bf5029d924cde0f1adf45f48bdc59f WatchSource:0}: Error finding container 9d5ac83a750d2c4bfcd260745b32252984bf5029d924cde0f1adf45f48bdc59f: Status 404 returned error can't find the container with id 9d5ac83a750d2c4bfcd260745b32252984bf5029d924cde0f1adf45f48bdc59f Apr 24 16:40:30.104709 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:30.104679 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"] Apr 24 16:40:30.274285 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:30.274249 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q5b2h" 
event={"ID":"d85b39e7-4145-4783-a50d-e94999b43e90","Type":"ContainerStarted","Data":"39db4746407bf7f56dafd93795d8c6e4668ba208272b35bb2f6fad3963eaa253"} Apr 24 16:40:30.275279 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:30.275258 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9wjxs" event={"ID":"4fda4ceb-5ea7-4202-903b-a9a5b5152485","Type":"ContainerStarted","Data":"9d5ac83a750d2c4bfcd260745b32252984bf5029d924cde0f1adf45f48bdc59f"} Apr 24 16:40:32.283966 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:32.283930 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q5b2h" event={"ID":"d85b39e7-4145-4783-a50d-e94999b43e90","Type":"ContainerStarted","Data":"060cccad59c5e0ba077782916aed43cfe621bf600f4c9711f24940e58b33f13f"} Apr 24 16:40:33.288668 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:33.288633 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9wjxs" event={"ID":"4fda4ceb-5ea7-4202-903b-a9a5b5152485","Type":"ContainerStarted","Data":"73c49e482fe633215f4a7a96e50911227850312a06bb4a5e7b555784476fb1ec"} Apr 24 16:40:33.289105 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:33.288723 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9wjxs" Apr 24 16:40:33.290307 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:33.290278 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q5b2h" event={"ID":"d85b39e7-4145-4783-a50d-e94999b43e90","Type":"ContainerStarted","Data":"6fa7b5420c518ff36d60f4599f36aeb90c518d7a3adb1524ea1169e55c905aa3"} Apr 24 16:40:33.307526 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:33.307447 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9wjxs" 
podStartSLOduration=65.397079335 podStartE2EDuration="1m8.307431274s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:40:30.069058392 +0000 UTC m=+65.754782614" lastFinishedPulling="2026-04-24 16:40:32.979410318 +0000 UTC m=+68.665134553" observedRunningTime="2026-04-24 16:40:33.305582306 +0000 UTC m=+68.991306550" watchObservedRunningTime="2026-04-24 16:40:33.307431274 +0000 UTC m=+68.993155517" Apr 24 16:40:33.320570 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:33.320514 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q5b2h" podStartSLOduration=66.623451111 podStartE2EDuration="1m8.320482472s" podCreationTimestamp="2026-04-24 16:39:25 +0000 UTC" firstStartedPulling="2026-04-24 16:40:30.050229302 +0000 UTC m=+65.735953524" lastFinishedPulling="2026-04-24 16:40:31.747260662 +0000 UTC m=+67.432984885" observedRunningTime="2026-04-24 16:40:33.319394399 +0000 UTC m=+69.005118646" watchObservedRunningTime="2026-04-24 16:40:33.320482472 +0000 UTC m=+69.006206715" Apr 24 16:40:38.850833 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:38.850711 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:38.850833 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:38.850754 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s" Apr 24 16:40:45.249232 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.249173 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55844dbbbc-8kmwb" podUID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" containerName="console" containerID="cri-o://338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5" gracePeriod=15 Apr 24 16:40:45.496966 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.496941 2581 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-55844dbbbc-8kmwb_bdea3f22-e17b-4388-bfaf-1282b94d2b83/console/0.log" Apr 24 16:40:45.497106 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.497017 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:45.568899 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.568811 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:40:45.588244 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.588218 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:40:45.664813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.664778 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.664813 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.664819 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665068 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.664879 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665068 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.664917 2581 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665068 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665030 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665214 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665066 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665214 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665102 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9292v\" (UniqueName: \"kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v\") pod \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\" (UID: \"bdea3f22-e17b-4388-bfaf-1282b94d2b83\") " Apr 24 16:40:45.665388 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665337 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca" (OuterVolumeSpecName: "service-ca") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:40:45.665388 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665374 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config" (OuterVolumeSpecName: "console-config") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:40:45.665388 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665347 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:40:45.665596 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665424 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:40:45.666240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665967 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-oauth-serving-cert\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.666240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.665994 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-service-ca\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.666240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.666012 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.666240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.666027 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdea3f22-e17b-4388-bfaf-1282b94d2b83-trusted-ca-bundle\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.667899 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.667864 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:40:45.667993 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.667948 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:40:45.668089 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.668066 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v" (OuterVolumeSpecName: "kube-api-access-9292v") pod "bdea3f22-e17b-4388-bfaf-1282b94d2b83" (UID: "bdea3f22-e17b-4388-bfaf-1282b94d2b83"). InnerVolumeSpecName "kube-api-access-9292v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:40:45.766784 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.766750 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-oauth-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.766784 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.766777 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdea3f22-e17b-4388-bfaf-1282b94d2b83-console-serving-cert\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.766784 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.766789 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9292v\" (UniqueName: \"kubernetes.io/projected/bdea3f22-e17b-4388-bfaf-1282b94d2b83-kube-api-access-9292v\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:40:45.816556 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:45.816527 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:40:46.331210 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331183 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55844dbbbc-8kmwb_bdea3f22-e17b-4388-bfaf-1282b94d2b83/console/0.log" Apr 24 16:40:46.331607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331223 2581 generic.go:358] "Generic (PLEG): container finished" podID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" containerID="338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5" exitCode=2 Apr 24 16:40:46.331607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331299 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55844dbbbc-8kmwb" Apr 24 16:40:46.331607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331314 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55844dbbbc-8kmwb" event={"ID":"bdea3f22-e17b-4388-bfaf-1282b94d2b83","Type":"ContainerDied","Data":"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5"} Apr 24 16:40:46.331607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331352 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55844dbbbc-8kmwb" event={"ID":"bdea3f22-e17b-4388-bfaf-1282b94d2b83","Type":"ContainerDied","Data":"868829818e56ce87aad369b391c297ddcf40ab000a0dc4eb958bbbf7b7a880c4"} Apr 24 16:40:46.331607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.331378 2581 scope.go:117] "RemoveContainer" containerID="338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5" Apr 24 16:40:46.340431 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.340411 2581 scope.go:117] "RemoveContainer" containerID="338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5" Apr 24 16:40:46.340747 
ip-10-0-143-104 kubenswrapper[2581]: E0424 16:40:46.340727 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5\": container with ID starting with 338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5 not found: ID does not exist" containerID="338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5" Apr 24 16:40:46.340806 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.340756 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5"} err="failed to get container status \"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5\": rpc error: code = NotFound desc = could not find container \"338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5\": container with ID starting with 338ffe559b21e4ff6850ed5b731ed266702f183da7d06f306fc60fd3484251d5 not found: ID does not exist" Apr 24 16:40:46.353002 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.352971 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"] Apr 24 16:40:46.357062 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.357035 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55844dbbbc-8kmwb"] Apr 24 16:40:46.886163 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:46.886124 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" path="/var/lib/kubelet/pods/bdea3f22-e17b-4388-bfaf-1282b94d2b83/volumes" Apr 24 16:40:55.123482 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.123441 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c78b78c5c-vpc78" podUID="f439dc58-a7ce-43d6-b84e-d91b40dec724" containerName="console" 
containerID="cri-o://5925558218efb1e22363ced315b63f87a5e6928d6f128855fc7bbb8c409f2edb" gracePeriod=15 Apr 24 16:40:55.364874 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.364850 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c78b78c5c-vpc78_f439dc58-a7ce-43d6-b84e-d91b40dec724/console/0.log" Apr 24 16:40:55.365005 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.364888 2581 generic.go:358] "Generic (PLEG): container finished" podID="f439dc58-a7ce-43d6-b84e-d91b40dec724" containerID="5925558218efb1e22363ced315b63f87a5e6928d6f128855fc7bbb8c409f2edb" exitCode=2 Apr 24 16:40:55.365005 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.364924 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c78b78c5c-vpc78" event={"ID":"f439dc58-a7ce-43d6-b84e-d91b40dec724","Type":"ContainerDied","Data":"5925558218efb1e22363ced315b63f87a5e6928d6f128855fc7bbb8c409f2edb"} Apr 24 16:40:55.419987 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.419964 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c78b78c5c-vpc78_f439dc58-a7ce-43d6-b84e-d91b40dec724/console/0.log" Apr 24 16:40:55.420103 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.420026 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c78b78c5c-vpc78" Apr 24 16:40:55.452034 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452005 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " Apr 24 16:40:55.452195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452052 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-www7w\" (UniqueName: \"kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " Apr 24 16:40:55.452195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452097 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " Apr 24 16:40:55.452195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452135 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " Apr 24 16:40:55.452316 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452210 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") " Apr 24 16:40:55.452316 
ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452254 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config\") pod \"f439dc58-a7ce-43d6-b84e-d91b40dec724\" (UID: \"f439dc58-a7ce-43d6-b84e-d91b40dec724\") "
Apr 24 16:40:55.452489 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452465 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca" (OuterVolumeSpecName: "service-ca") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:40:55.452797 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452769 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:40:55.452898 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.452779 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config" (OuterVolumeSpecName: "console-config") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:40:55.454845 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.454813 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:40:55.454845 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.454831 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:40:55.454960 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.454822 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w" (OuterVolumeSpecName: "kube-api-access-www7w") pod "f439dc58-a7ce-43d6-b84e-d91b40dec724" (UID: "f439dc58-a7ce-43d6-b84e-d91b40dec724"). InnerVolumeSpecName "kube-api-access-www7w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:40:55.552991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.552954 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-serving-cert\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:55.552991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.552986 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-oauth-serving-cert\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:55.552991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.552995 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:55.552991 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.553004 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f439dc58-a7ce-43d6-b84e-d91b40dec724-service-ca\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:55.553281 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.553014 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-www7w\" (UniqueName: \"kubernetes.io/projected/f439dc58-a7ce-43d6-b84e-d91b40dec724-kube-api-access-www7w\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:55.553281 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:55.553022 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f439dc58-a7ce-43d6-b84e-d91b40dec724-console-oauth-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\""
Apr 24 16:40:56.368940 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.368909 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c78b78c5c-vpc78_f439dc58-a7ce-43d6-b84e-d91b40dec724/console/0.log"
Apr 24 16:40:56.369373 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.369003 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c78b78c5c-vpc78" event={"ID":"f439dc58-a7ce-43d6-b84e-d91b40dec724","Type":"ContainerDied","Data":"929e483bf3303fc0ed92fb8a0e3db1b86260e524e15e14bc681e723435a8bb26"}
Apr 24 16:40:56.369373 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.369015 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c78b78c5c-vpc78"
Apr 24 16:40:56.369373 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.369053 2581 scope.go:117] "RemoveContainer" containerID="5925558218efb1e22363ced315b63f87a5e6928d6f128855fc7bbb8c409f2edb"
Apr 24 16:40:56.390057 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.390032 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"]
Apr 24 16:40:56.393924 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.393905 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c78b78c5c-vpc78"]
Apr 24 16:40:56.888864 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:56.888826 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f439dc58-a7ce-43d6-b84e-d91b40dec724" path="/var/lib/kubelet/pods/f439dc58-a7ce-43d6-b84e-d91b40dec724/volumes"
Apr 24 16:40:58.856553 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:58.856524 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:40:58.860373 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:40:58.860352 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-67859b89cc-28g4s"
Apr 24 16:41:04.296841 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:04.296807 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9wjxs"
Apr 24 16:41:07.852334 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.852291 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6b649c869d-lvprs"]
Apr 24 16:41:07.852937 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.852913 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" containerName="console"
Apr 24 16:41:07.853032 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.852940 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" containerName="console"
Apr 24 16:41:07.853032 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.852969 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f439dc58-a7ce-43d6-b84e-d91b40dec724" containerName="console"
Apr 24 16:41:07.853032 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.852978 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f439dc58-a7ce-43d6-b84e-d91b40dec724" containerName="console"
Apr 24 16:41:07.853154 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.853125 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdea3f22-e17b-4388-bfaf-1282b94d2b83" containerName="console"
Apr 24 16:41:07.853195 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.853162 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f439dc58-a7ce-43d6-b84e-d91b40dec724" containerName="console"
Apr 24 16:41:07.858722 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.858692 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.861113 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.861083 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 24 16:41:07.861582 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.861557 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tm67b\""
Apr 24 16:41:07.861891 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.861868 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 24 16:41:07.862022 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.862001 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 24 16:41:07.862098 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.862082 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 24 16:41:07.862353 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.862331 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 24 16:41:07.867320 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.867293 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 24 16:41:07.870750 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.870720 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b649c869d-lvprs"]
Apr 24 16:41:07.958417 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958377 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-metrics-client-ca\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958417 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958420 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958660 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958450 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958660 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958565 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-serving-certs-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958660 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-federate-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958660 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958640 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958812 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958707 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:07.958812 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:07.958724 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsxj\" (UniqueName: \"kubernetes.io/projected/61283d35-0e23-4324-b440-b03eba0c9a0f-kube-api-access-9xsxj\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.059794 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059757 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.059794 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059799 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsxj\" (UniqueName: \"kubernetes.io/projected/61283d35-0e23-4324-b440-b03eba0c9a0f-kube-api-access-9xsxj\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059841 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-metrics-client-ca\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059861 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.059914 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-serving-certs-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060037 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.060002 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-federate-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060277 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.060075 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060867 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.060805 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-serving-certs-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.060867 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.060820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-metrics-client-ca\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.061093 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.060960 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.062480 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.062453 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-telemeter-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.062607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.062576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.062649 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.062635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-federate-client-tls\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.062682 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.062646 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/61283d35-0e23-4324-b440-b03eba0c9a0f-secret-telemeter-client\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.067858 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.067835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsxj\" (UniqueName: \"kubernetes.io/projected/61283d35-0e23-4324-b440-b03eba0c9a0f-kube-api-access-9xsxj\") pod \"telemeter-client-6b649c869d-lvprs\" (UID: \"61283d35-0e23-4324-b440-b03eba0c9a0f\") " pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.178916 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.178823 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs"
Apr 24 16:41:08.308886 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.308855 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6b649c869d-lvprs"]
Apr 24 16:41:08.312536 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:41:08.312485 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61283d35_0e23_4324_b440_b03eba0c9a0f.slice/crio-deba7fac3dea11d173504d43f0d4ff06cde6fe5831993a3ed9288c58314a65b7 WatchSource:0}: Error finding container deba7fac3dea11d173504d43f0d4ff06cde6fe5831993a3ed9288c58314a65b7: Status 404 returned error can't find the container with id deba7fac3dea11d173504d43f0d4ff06cde6fe5831993a3ed9288c58314a65b7
Apr 24 16:41:08.417026 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:08.416967 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs" event={"ID":"61283d35-0e23-4324-b440-b03eba0c9a0f","Type":"ContainerStarted","Data":"deba7fac3dea11d173504d43f0d4ff06cde6fe5831993a3ed9288c58314a65b7"}
Apr 24 16:41:10.425857 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:10.425773 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs" event={"ID":"61283d35-0e23-4324-b440-b03eba0c9a0f","Type":"ContainerStarted","Data":"2ae6aafe15e13f0f66cd4c04944df930239c765979c3dcee7ee1ca27970fa91a"}
Apr 24 16:41:10.425857 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:10.425811 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs" event={"ID":"61283d35-0e23-4324-b440-b03eba0c9a0f","Type":"ContainerStarted","Data":"89d76bc433d7992d586a10f449c71b2ffc4449b3fd0c8f1714cc9feafb0d353f"}
Apr 24 16:41:10.425857 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:10.425823 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs" event={"ID":"61283d35-0e23-4324-b440-b03eba0c9a0f","Type":"ContainerStarted","Data":"bcf62aebbf984c6c34f77cb1b39d371b4136c9cfaa8af4b1988c7e2128247d01"}
Apr 24 16:41:10.449189 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:10.449130 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6b649c869d-lvprs" podStartSLOduration=1.690026993 podStartE2EDuration="3.449112219s" podCreationTimestamp="2026-04-24 16:41:07 +0000 UTC" firstStartedPulling="2026-04-24 16:41:08.314289195 +0000 UTC m=+104.000013431" lastFinishedPulling="2026-04-24 16:41:10.073374425 +0000 UTC m=+105.759098657" observedRunningTime="2026-04-24 16:41:10.447258029 +0000 UTC m=+106.132982275" watchObservedRunningTime="2026-04-24 16:41:10.449112219 +0000 UTC m=+106.134836465"
Apr 24 16:41:55.504594 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.504558 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-592vt"]
Apr 24 16:41:55.508719 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.508697 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.511214 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.511194 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 16:41:55.518673 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.518651 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-592vt"]
Apr 24 16:41:55.565141 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.565110 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-kubelet-config\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.565141 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.565147 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-dbus\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.565351 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.565225 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f7a291a-6e18-499f-83a3-c48bffb2fdec-original-pull-secret\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.666666 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.666626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-dbus\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.666827 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.666787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f7a291a-6e18-499f-83a3-c48bffb2fdec-original-pull-secret\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.666868 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.666831 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-dbus\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.666928 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.666912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-kubelet-config\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.666993 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.666981 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f7a291a-6e18-499f-83a3-c48bffb2fdec-kubelet-config\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.669034 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.669015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f7a291a-6e18-499f-83a3-c48bffb2fdec-original-pull-secret\") pod \"global-pull-secret-syncer-592vt\" (UID: \"2f7a291a-6e18-499f-83a3-c48bffb2fdec\") " pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.818306 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.818229 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-592vt"
Apr 24 16:41:55.936570 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:55.936543 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-592vt"]
Apr 24 16:41:55.939692 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:41:55.939662 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7a291a_6e18_499f_83a3_c48bffb2fdec.slice/crio-740bac8df5a84d1786ec26b5c61c77ea3774b7ef4488d8033131cef49b7e38a4 WatchSource:0}: Error finding container 740bac8df5a84d1786ec26b5c61c77ea3774b7ef4488d8033131cef49b7e38a4: Status 404 returned error can't find the container with id 740bac8df5a84d1786ec26b5c61c77ea3774b7ef4488d8033131cef49b7e38a4
Apr 24 16:41:56.565829 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:56.565794 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-592vt" event={"ID":"2f7a291a-6e18-499f-83a3-c48bffb2fdec","Type":"ContainerStarted","Data":"740bac8df5a84d1786ec26b5c61c77ea3774b7ef4488d8033131cef49b7e38a4"}
Apr 24 16:41:59.576522 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:59.576462 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-592vt" event={"ID":"2f7a291a-6e18-499f-83a3-c48bffb2fdec","Type":"ContainerStarted","Data":"9bbd6f9a2428d60c663252719d1cd6cf0fb778e97fdd034ea0b59ba9836bde32"}
Apr 24 16:41:59.594932 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:41:59.594886 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-592vt" podStartSLOduration=1.122430903 podStartE2EDuration="4.594870174s" podCreationTimestamp="2026-04-24 16:41:55 +0000 UTC" firstStartedPulling="2026-04-24 16:41:55.941257999 +0000 UTC m=+151.626982219" lastFinishedPulling="2026-04-24 16:41:59.413697249 +0000 UTC m=+155.099421490" observedRunningTime="2026-04-24 16:41:59.59313418 +0000 UTC m=+155.278858423" watchObservedRunningTime="2026-04-24 16:41:59.594870174 +0000 UTC m=+155.280594417"
Apr 24 16:43:52.733817 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.733738 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"]
Apr 24 16:43:52.736943 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.736926 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.739678 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.739658 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 16:43:52.740616 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.740595 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-7lb86\""
Apr 24 16:43:52.740804 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.740617 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 16:43:52.740892 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.740662 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 16:43:52.746405 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.746380 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"]
Apr 24 16:43:52.867481 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.867435 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.867685 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.867486 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w27l\" (UniqueName: \"kubernetes.io/projected/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-kube-api-access-6w27l\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.968332 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.968296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w27l\" (UniqueName: \"kubernetes.io/projected/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-kube-api-access-6w27l\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.968490 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.968389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.970792 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.970767 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:52.978274 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:52.978243 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w27l\" (UniqueName: \"kubernetes.io/projected/f3ea16ba-f3a9-4183-b7a4-b080b0a08438-kube-api-access-6w27l\") pod \"llmisvc-controller-manager-68cc5db7c4-w2nx7\" (UID: \"f3ea16ba-f3a9-4183-b7a4-b080b0a08438\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:53.048339 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:53.048250 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:53.172529 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:53.172395 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"]
Apr 24 16:43:53.175302 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:43:53.175272 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf3ea16ba_f3a9_4183_b7a4_b080b0a08438.slice/crio-26a15682847dd8ad51ab9b01940633b77d641513c80a89cec6c1ac943eba8274 WatchSource:0}: Error finding container 26a15682847dd8ad51ab9b01940633b77d641513c80a89cec6c1ac943eba8274: Status 404 returned error can't find the container with id 26a15682847dd8ad51ab9b01940633b77d641513c80a89cec6c1ac943eba8274
Apr 24 16:43:53.909458 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:53.909398 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7" event={"ID":"f3ea16ba-f3a9-4183-b7a4-b080b0a08438","Type":"ContainerStarted","Data":"26a15682847dd8ad51ab9b01940633b77d641513c80a89cec6c1ac943eba8274"}
Apr 24 16:43:55.916184 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:55.916151 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7" event={"ID":"f3ea16ba-f3a9-4183-b7a4-b080b0a08438","Type":"ContainerStarted","Data":"9b28740200baed19f8392abec9bcbe5cbb0c00d707187b1432c5cfef091b2198"}
Apr 24 16:43:55.916620 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:55.916298 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:43:55.934990 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:43:55.934929 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7" podStartSLOduration=2.021647547 podStartE2EDuration="3.934913634s" podCreationTimestamp="2026-04-24 16:43:52 +0000 UTC" firstStartedPulling="2026-04-24 16:43:53.176554541 +0000 UTC m=+268.862278763" lastFinishedPulling="2026-04-24 16:43:55.08982063 +0000 UTC m=+270.775544850" observedRunningTime="2026-04-24 16:43:55.933184423 +0000 UTC m=+271.618908666" watchObservedRunningTime="2026-04-24 16:43:55.934913634 +0000 UTC m=+271.620637874"
Apr 24 16:44:24.791607 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:44:24.791581 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log"
Apr 24 16:44:24.792340 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:44:24.791586 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log"
Apr 24 16:44:24.796721 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:44:24.796700 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 16:44:26.921619 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:44:26.921588 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-w2nx7"
Apr 24 16:45:17.703836 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.703793 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-jlt9x"]
Apr 24 16:45:17.707013 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.706996 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/s3-init-jlt9x" Apr 24 16:45:17.709458 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.709438 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:45:17.709581 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.709437 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-j66k8\"" Apr 24 16:45:17.713866 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.713841 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jlt9x"] Apr 24 16:45:17.789898 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.789856 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsh9\" (UniqueName: \"kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9\") pod \"s3-init-jlt9x\" (UID: \"d91a6800-a301-4363-b2e0-90a3d90c9242\") " pod="kserve/s3-init-jlt9x" Apr 24 16:45:17.890990 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.890949 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsh9\" (UniqueName: \"kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9\") pod \"s3-init-jlt9x\" (UID: \"d91a6800-a301-4363-b2e0-90a3d90c9242\") " pod="kserve/s3-init-jlt9x" Apr 24 16:45:17.899343 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:17.899314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsh9\" (UniqueName: \"kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9\") pod \"s3-init-jlt9x\" (UID: \"d91a6800-a301-4363-b2e0-90a3d90c9242\") " pod="kserve/s3-init-jlt9x" Apr 24 16:45:18.029681 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:18.029632 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jlt9x" Apr 24 16:45:18.147999 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:18.147967 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-jlt9x"] Apr 24 16:45:18.151577 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:45:18.151545 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd91a6800_a301_4363_b2e0_90a3d90c9242.slice/crio-ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749 WatchSource:0}: Error finding container ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749: Status 404 returned error can't find the container with id ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749 Apr 24 16:45:18.153818 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:18.153800 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:45:19.152887 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:19.152842 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jlt9x" event={"ID":"d91a6800-a301-4363-b2e0-90a3d90c9242","Type":"ContainerStarted","Data":"ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749"} Apr 24 16:45:23.165992 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:23.165955 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jlt9x" event={"ID":"d91a6800-a301-4363-b2e0-90a3d90c9242","Type":"ContainerStarted","Data":"87cb83a620b59cba35e67abecc415b720eb6e0e4860812ea6d02dd163163dd91"} Apr 24 16:45:23.188861 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:23.188803 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-jlt9x" podStartSLOduration=1.708844945 podStartE2EDuration="6.188785754s" podCreationTimestamp="2026-04-24 16:45:17 +0000 UTC" firstStartedPulling="2026-04-24 16:45:18.153937351 +0000 UTC m=+353.839661576" 
lastFinishedPulling="2026-04-24 16:45:22.633878146 +0000 UTC m=+358.319602385" observedRunningTime="2026-04-24 16:45:23.186760698 +0000 UTC m=+358.872484942" watchObservedRunningTime="2026-04-24 16:45:23.188785754 +0000 UTC m=+358.874509996" Apr 24 16:45:26.176116 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:26.176028 2581 generic.go:358] "Generic (PLEG): container finished" podID="d91a6800-a301-4363-b2e0-90a3d90c9242" containerID="87cb83a620b59cba35e67abecc415b720eb6e0e4860812ea6d02dd163163dd91" exitCode=0 Apr 24 16:45:26.176116 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:26.176097 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jlt9x" event={"ID":"d91a6800-a301-4363-b2e0-90a3d90c9242","Type":"ContainerDied","Data":"87cb83a620b59cba35e67abecc415b720eb6e0e4860812ea6d02dd163163dd91"} Apr 24 16:45:27.297199 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:27.297174 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-jlt9x" Apr 24 16:45:27.375812 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:27.375774 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnsh9\" (UniqueName: \"kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9\") pod \"d91a6800-a301-4363-b2e0-90a3d90c9242\" (UID: \"d91a6800-a301-4363-b2e0-90a3d90c9242\") " Apr 24 16:45:27.382196 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:27.382156 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9" (OuterVolumeSpecName: "kube-api-access-rnsh9") pod "d91a6800-a301-4363-b2e0-90a3d90c9242" (UID: "d91a6800-a301-4363-b2e0-90a3d90c9242"). InnerVolumeSpecName "kube-api-access-rnsh9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:45:27.476975 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:27.476920 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnsh9\" (UniqueName: \"kubernetes.io/projected/d91a6800-a301-4363-b2e0-90a3d90c9242-kube-api-access-rnsh9\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:45:28.186750 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:28.186716 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-jlt9x" event={"ID":"d91a6800-a301-4363-b2e0-90a3d90c9242","Type":"ContainerDied","Data":"ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749"} Apr 24 16:45:28.186750 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:28.186754 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae2d97f3e6344b3d414d050a5f4b4804ad3da81bbe581611efb88dc557b07749" Apr 24 16:45:28.186750 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:28.186732 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-jlt9x" Apr 24 16:45:37.916106 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.916065 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:45:37.916730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.916621 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d91a6800-a301-4363-b2e0-90a3d90c9242" containerName="s3-init" Apr 24 16:45:37.916730 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.916642 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d91a6800-a301-4363-b2e0-90a3d90c9242" containerName="s3-init" Apr 24 16:45:37.916857 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.916734 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d91a6800-a301-4363-b2e0-90a3d90c9242" containerName="s3-init" Apr 24 16:45:37.919977 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.919953 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:37.922314 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.922283 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:45:37.922450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.922368 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:45:37.922450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.922427 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 24 16:45:37.922450 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.922439 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 24 16:45:37.923135 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.923120 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gd6p9\"" Apr 24 16:45:37.930527 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:37.930481 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:45:38.066909 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.066870 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.067085 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:45:38.066924 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.067085 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.067001 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.067085 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.067026 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vz4\" (UniqueName: \"kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.168468 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.168376 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.168468 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.168438 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.168468 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.168462 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vz4\" (UniqueName: \"kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.168734 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.168529 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.168885 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.168862 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.169124 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.169094 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.170970 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.170942 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.176626 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.176602 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vz4\" (UniqueName: \"kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4\") pod \"isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.231671 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.231629 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:45:38.363194 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.363158 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:45:38.366785 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:45:38.366753 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc05d0d59_f939_480f_8087_d416851e1cea.slice/crio-935ec30d88c67c13d13cd2a653ee086bc4c387ef6309a9f183961782d21e3423 WatchSource:0}: Error finding container 935ec30d88c67c13d13cd2a653ee086bc4c387ef6309a9f183961782d21e3423: Status 404 returned error can't find the container with id 935ec30d88c67c13d13cd2a653ee086bc4c387ef6309a9f183961782d21e3423 Apr 24 16:45:38.924066 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.924029 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:45:38.930130 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.930106 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:38.932423 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.932400 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 24 16:45:38.932674 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.932432 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 24 16:45:38.937090 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:38.937065 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:45:39.075862 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.075812 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.076054 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.075963 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp9j\" (UniqueName: \"kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.076054 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.076041 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.076176 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.076088 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.176860 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.176776 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.177020 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.176863 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.177020 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.176927 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp9j\" (UniqueName: 
\"kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.177020 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.176981 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.177872 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.177545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.177986 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.177877 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.179946 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.179871 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls\") pod 
\"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.186004 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.185953 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp9j\" (UniqueName: \"kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j\") pod \"isvc-sklearn-graph-2-predictor-849696d87b-dzp6r\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.220829 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.220786 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerStarted","Data":"935ec30d88c67c13d13cd2a653ee086bc4c387ef6309a9f183961782d21e3423"} Apr 24 16:45:39.250293 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.250251 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:45:39.406830 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:39.406796 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:45:39.409896 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:45:39.409557 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2637fd52_945a_467d_8c18_3a8918b60faf.slice/crio-f55a4e898bc45acbafe2afcc330ef3f42f2b60e7f9562c52231b2428e1a5c7d9 WatchSource:0}: Error finding container f55a4e898bc45acbafe2afcc330ef3f42f2b60e7f9562c52231b2428e1a5c7d9: Status 404 returned error can't find the container with id f55a4e898bc45acbafe2afcc330ef3f42f2b60e7f9562c52231b2428e1a5c7d9 Apr 24 16:45:40.226207 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:40.226157 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerStarted","Data":"f55a4e898bc45acbafe2afcc330ef3f42f2b60e7f9562c52231b2428e1a5c7d9"} Apr 24 16:45:43.237838 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:43.237797 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerStarted","Data":"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37"} Apr 24 16:45:43.239137 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:43.239097 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerStarted","Data":"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686"} Apr 24 16:45:47.252668 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:45:47.252632 2581 generic.go:358] "Generic (PLEG): container finished" podID="c05d0d59-f939-480f-8087-d416851e1cea" containerID="b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37" exitCode=0 Apr 24 16:45:47.253136 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:47.252707 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerDied","Data":"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37"} Apr 24 16:45:47.254143 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:47.254119 2581 generic.go:358] "Generic (PLEG): container finished" podID="2637fd52-945a-467d-8c18-3a8918b60faf" containerID="1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686" exitCode=0 Apr 24 16:45:47.254241 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:45:47.254147 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerDied","Data":"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686"} Apr 24 16:46:03.343325 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:03.343284 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerStarted","Data":"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9"} Apr 24 16:46:03.345580 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:03.345547 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerStarted","Data":"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89"} Apr 24 16:46:06.359072 ip-10-0-143-104 
kubenswrapper[2581]: I0424 16:46:06.359027 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerStarted","Data":"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330"} Apr 24 16:46:06.359562 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:06.359196 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:46:06.361091 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:06.361062 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerStarted","Data":"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f"} Apr 24 16:46:06.361242 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:06.361229 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:46:06.392771 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:06.392709 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podStartSLOduration=2.27438253 podStartE2EDuration="29.392690827s" podCreationTimestamp="2026-04-24 16:45:37 +0000 UTC" firstStartedPulling="2026-04-24 16:45:38.368712131 +0000 UTC m=+374.054436352" lastFinishedPulling="2026-04-24 16:46:05.48702042 +0000 UTC m=+401.172744649" observedRunningTime="2026-04-24 16:46:06.390050561 +0000 UTC m=+402.075774805" watchObservedRunningTime="2026-04-24 16:46:06.392690827 +0000 UTC m=+402.078415071" Apr 24 16:46:06.419408 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:06.419358 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podStartSLOduration=2.338615903 podStartE2EDuration="28.419342992s" podCreationTimestamp="2026-04-24 16:45:38 +0000 UTC" firstStartedPulling="2026-04-24 16:45:39.412237893 +0000 UTC m=+375.097962119" lastFinishedPulling="2026-04-24 16:46:05.492964988 +0000 UTC m=+401.178689208" observedRunningTime="2026-04-24 16:46:06.418679032 +0000 UTC m=+402.104403276" watchObservedRunningTime="2026-04-24 16:46:06.419342992 +0000 UTC m=+402.105067234" Apr 24 16:46:07.369526 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:07.369452 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:46:07.369914 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:07.369543 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:46:07.369914 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:07.369562 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:46:07.370047 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:07.369978 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:08.368573 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:08.368526 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:46:08.368761 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:08.368622 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:11.652927 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.652894 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b898679b9-52l5m"] Apr 24 16:46:11.733072 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.733028 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b898679b9-52l5m"] Apr 24 16:46:11.733230 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.733151 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.736256 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.736223 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:46:11.736406 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.736232 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:46:11.737086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.737066 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:46:11.737176 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.737107 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:46:11.737176 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.737118 2581 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:46:11.737176 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.737137 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:46:11.739610 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.739578 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-lz9x2\"" Apr 24 16:46:11.740275 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.740248 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:46:11.746049 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.746027 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:46:11.798109 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-console-config\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798283 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798115 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-oauth-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798283 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798193 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-trusted-ca-bundle\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798283 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798278 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-oauth-config\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798377 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798307 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zclp\" (UniqueName: \"kubernetes.io/projected/a1768070-3ece-4943-898e-d6ec601b7510-kube-api-access-5zclp\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798377 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798340 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.798377 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.798363 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-service-ca\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 
16:46:11.898851 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.898814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.898851 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.898851 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-service-ca\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.898877 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-console-config\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.898903 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-oauth-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.898941 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-trusted-ca-bundle\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " 
pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899028 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-oauth-config\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899086 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899048 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zclp\" (UniqueName: \"kubernetes.io/projected/a1768070-3ece-4943-898e-d6ec601b7510-kube-api-access-5zclp\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899674 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899651 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-service-ca\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899840 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899811 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-oauth-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899909 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899856 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-console-config\") pod \"console-7b898679b9-52l5m\" (UID: 
\"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.899950 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.899932 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1768070-3ece-4943-898e-d6ec601b7510-trusted-ca-bundle\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.901383 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.901358 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-oauth-config\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.901464 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.901417 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1768070-3ece-4943-898e-d6ec601b7510-console-serving-cert\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:11.909735 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:11.909681 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zclp\" (UniqueName: \"kubernetes.io/projected/a1768070-3ece-4943-898e-d6ec601b7510-kube-api-access-5zclp\") pod \"console-7b898679b9-52l5m\" (UID: \"a1768070-3ece-4943-898e-d6ec601b7510\") " pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:12.044676 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:12.044634 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:12.183184 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:12.183102 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b898679b9-52l5m"] Apr 24 16:46:12.187923 ip-10-0-143-104 kubenswrapper[2581]: W0424 16:46:12.187895 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1768070_3ece_4943_898e_d6ec601b7510.slice/crio-bfbea66bd9080546779b7af5991bf2a69f8f25bb91c9d4cda6d7b73ffa20f491 WatchSource:0}: Error finding container bfbea66bd9080546779b7af5991bf2a69f8f25bb91c9d4cda6d7b73ffa20f491: Status 404 returned error can't find the container with id bfbea66bd9080546779b7af5991bf2a69f8f25bb91c9d4cda6d7b73ffa20f491 Apr 24 16:46:12.382233 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:12.382194 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b898679b9-52l5m" event={"ID":"a1768070-3ece-4943-898e-d6ec601b7510","Type":"ContainerStarted","Data":"86865df57e11c67fb5f2200a65d8c06c61927fcefa11f4d60b8c343d36e9beb8"} Apr 24 16:46:12.382233 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:12.382235 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b898679b9-52l5m" event={"ID":"a1768070-3ece-4943-898e-d6ec601b7510","Type":"ContainerStarted","Data":"bfbea66bd9080546779b7af5991bf2a69f8f25bb91c9d4cda6d7b73ffa20f491"} Apr 24 16:46:12.412264 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:12.412210 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b898679b9-52l5m" podStartSLOduration=1.412194699 podStartE2EDuration="1.412194699s" podCreationTimestamp="2026-04-24 16:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:46:12.410831744 +0000 UTC 
m=+408.096555991" watchObservedRunningTime="2026-04-24 16:46:12.412194699 +0000 UTC m=+408.097918944" Apr 24 16:46:13.374186 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:13.374152 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:46:13.374605 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:13.374571 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:46:13.374649 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:13.374610 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:13.375114 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:13.375090 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:46:22.044924 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:22.044886 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:22.045430 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:22.045053 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:22.050272 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:22.050246 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:22.417385 ip-10-0-143-104 kubenswrapper[2581]: 
I0424 16:46:22.417312 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b898679b9-52l5m" Apr 24 16:46:23.375212 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:23.375166 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:46:23.375675 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:23.375170 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:33.375090 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:33.375043 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:33.375645 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:33.375047 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:46:43.375477 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:43.375382 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.23:8080: connect: connection refused" Apr 24 16:46:43.375956 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:43.375382 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:53.375380 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:53.375339 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:46:53.375865 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:46:53.375339 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:47:03.375371 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:03.375330 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 24 16:47:03.375371 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:03.375356 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 24 16:47:13.375487 ip-10-0-143-104 kubenswrapper[2581]: I0424 
16:47:13.375453 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:47:13.375997 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:13.375557 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:47:47.959062 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.959025 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:47:47.959626 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.959448 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" containerID="cri-o://aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9" gracePeriod=30 Apr 24 16:47:47.959626 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.959535 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kube-rbac-proxy" containerID="cri-o://1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f" gracePeriod=30 Apr 24 16:47:47.996997 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.996965 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:47:47.997318 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.997280 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" 
containerID="cri-o://713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89" gracePeriod=30 Apr 24 16:47:47.997414 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:47.997333 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kube-rbac-proxy" containerID="cri-o://7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330" gracePeriod=30 Apr 24 16:47:48.369194 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.369101 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 24 16:47:48.369194 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.369160 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused" Apr 24 16:47:48.690403 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.690313 2581 generic.go:358] "Generic (PLEG): container finished" podID="2637fd52-945a-467d-8c18-3a8918b60faf" containerID="1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f" exitCode=2 Apr 24 16:47:48.690403 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.690391 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerDied","Data":"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f"} Apr 24 
16:47:48.692231 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.692208 2581 generic.go:358] "Generic (PLEG): container finished" podID="c05d0d59-f939-480f-8087-d416851e1cea" containerID="7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330" exitCode=2 Apr 24 16:47:48.692341 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:48.692237 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerDied","Data":"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330"} Apr 24 16:47:52.452756 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.452733 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:47:52.455836 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.455811 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:47:52.619424 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619341 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls\") pod \"2637fd52-945a-467d-8c18-3a8918b60faf\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " Apr 24 16:47:52.619584 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619427 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"c05d0d59-f939-480f-8087-d416851e1cea\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " Apr 24 16:47:52.619584 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619554 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vz4\" (UniqueName: \"kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4\") pod \"c05d0d59-f939-480f-8087-d416851e1cea\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " Apr 24 16:47:52.619714 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619595 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp9j\" (UniqueName: \"kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j\") pod \"2637fd52-945a-467d-8c18-3a8918b60faf\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " Apr 24 16:47:52.619714 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619646 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location\") pod 
\"c05d0d59-f939-480f-8087-d416851e1cea\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " Apr 24 16:47:52.619820 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619718 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls\") pod \"c05d0d59-f939-480f-8087-d416851e1cea\" (UID: \"c05d0d59-f939-480f-8087-d416851e1cea\") " Apr 24 16:47:52.619820 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619742 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "c05d0d59-f939-480f-8087-d416851e1cea" (UID: "c05d0d59-f939-480f-8087-d416851e1cea"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:47:52.619820 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619768 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location\") pod \"2637fd52-945a-467d-8c18-3a8918b60faf\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " Apr 24 16:47:52.619820 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.619811 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"2637fd52-945a-467d-8c18-3a8918b60faf\" (UID: \"2637fd52-945a-467d-8c18-3a8918b60faf\") " Apr 24 16:47:52.620028 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.620003 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c05d0d59-f939-480f-8087-d416851e1cea" (UID: "c05d0d59-f939-480f-8087-d416851e1cea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:52.620112 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.620085 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2637fd52-945a-467d-8c18-3a8918b60faf" (UID: "2637fd52-945a-467d-8c18-3a8918b60faf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:52.620180 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.620160 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c05d0d59-f939-480f-8087-d416851e1cea-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.620239 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.620186 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c05d0d59-f939-480f-8087-d416851e1cea-kserve-provision-location\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.620372 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.620343 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "2637fd52-945a-467d-8c18-3a8918b60faf" (UID: "2637fd52-945a-467d-8c18-3a8918b60faf"). 
InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:47:52.621758 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.621727 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4" (OuterVolumeSpecName: "kube-api-access-s5vz4") pod "c05d0d59-f939-480f-8087-d416851e1cea" (UID: "c05d0d59-f939-480f-8087-d416851e1cea"). InnerVolumeSpecName "kube-api-access-s5vz4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:52.621758 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.621745 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2637fd52-945a-467d-8c18-3a8918b60faf" (UID: "2637fd52-945a-467d-8c18-3a8918b60faf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:47:52.621891 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.621785 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c05d0d59-f939-480f-8087-d416851e1cea" (UID: "c05d0d59-f939-480f-8087-d416851e1cea"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:47:52.621891 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.621812 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j" (OuterVolumeSpecName: "kube-api-access-gwp9j") pod "2637fd52-945a-467d-8c18-3a8918b60faf" (UID: "2637fd52-945a-467d-8c18-3a8918b60faf"). InnerVolumeSpecName "kube-api-access-gwp9j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:52.705859 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.705823 2581 generic.go:358] "Generic (PLEG): container finished" podID="c05d0d59-f939-480f-8087-d416851e1cea" containerID="713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89" exitCode=0 Apr 24 16:47:52.706045 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.705899 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" Apr 24 16:47:52.706045 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.705906 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerDied","Data":"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89"} Apr 24 16:47:52.706045 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.705942 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t" event={"ID":"c05d0d59-f939-480f-8087-d416851e1cea","Type":"ContainerDied","Data":"935ec30d88c67c13d13cd2a653ee086bc4c387ef6309a9f183961782d21e3423"} Apr 24 16:47:52.706045 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.705965 2581 scope.go:117] "RemoveContainer" containerID="7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330" Apr 24 16:47:52.707741 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.707719 2581 generic.go:358] "Generic (PLEG): container finished" podID="2637fd52-945a-467d-8c18-3a8918b60faf" containerID="aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9" exitCode=0 Apr 24 16:47:52.707847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.707783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" 
event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerDied","Data":"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9"} Apr 24 16:47:52.707847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.707789 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" Apr 24 16:47:52.707847 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.707808 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r" event={"ID":"2637fd52-945a-467d-8c18-3a8918b60faf","Type":"ContainerDied","Data":"f55a4e898bc45acbafe2afcc330ef3f42f2b60e7f9562c52231b2428e1a5c7d9"} Apr 24 16:47:52.714597 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.714580 2581 scope.go:117] "RemoveContainer" containerID="713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89" Apr 24 16:47:52.721143 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721121 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2637fd52-945a-467d-8c18-3a8918b60faf-proxy-tls\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.721143 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721143 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5vz4\" (UniqueName: \"kubernetes.io/projected/c05d0d59-f939-480f-8087-d416851e1cea-kube-api-access-s5vz4\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.721259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721154 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gwp9j\" (UniqueName: \"kubernetes.io/projected/2637fd52-945a-467d-8c18-3a8918b60faf-kube-api-access-gwp9j\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.721259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721164 2581 
reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c05d0d59-f939-480f-8087-d416851e1cea-proxy-tls\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.721259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721173 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2637fd52-945a-467d-8c18-3a8918b60faf-kserve-provision-location\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.721259 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.721182 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2637fd52-945a-467d-8c18-3a8918b60faf-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 16:47:52.722521 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.722492 2581 scope.go:117] "RemoveContainer" containerID="b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37" Apr 24 16:47:52.728925 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.728902 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:47:52.730696 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.730672 2581 scope.go:117] "RemoveContainer" containerID="7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330" Apr 24 16:47:52.730949 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.730933 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330\": container with ID starting with 7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330 not found: ID does not exist" 
containerID="7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330" Apr 24 16:47:52.730993 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.730958 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330"} err="failed to get container status \"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330\": rpc error: code = NotFound desc = could not find container \"7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330\": container with ID starting with 7223ad60864b9a54ce6a40861d40c7a5754dc871fed2bdcb2bbdcd29b5914330 not found: ID does not exist" Apr 24 16:47:52.730993 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.730975 2581 scope.go:117] "RemoveContainer" containerID="713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89" Apr 24 16:47:52.731196 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.731179 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89\": container with ID starting with 713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89 not found: ID does not exist" containerID="713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89" Apr 24 16:47:52.731240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.731203 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89"} err="failed to get container status \"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89\": rpc error: code = NotFound desc = could not find container \"713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89\": container with ID starting with 713dcfe49d627fd2ecbd2a3e01bcff49945498b70ed6e5193d53706e3f9ead89 not found: ID does not exist" Apr 24 
16:47:52.731240 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.731217 2581 scope.go:117] "RemoveContainer" containerID="b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37" Apr 24 16:47:52.731400 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.731386 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37\": container with ID starting with b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37 not found: ID does not exist" containerID="b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37" Apr 24 16:47:52.731439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.731403 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37"} err="failed to get container status \"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37\": rpc error: code = NotFound desc = could not find container \"b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37\": container with ID starting with b0285c5a3cfb889b727f28779feed6ac7b7e0ae48be23e1d5f90f88361d1cb37 not found: ID does not exist" Apr 24 16:47:52.731439 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.731414 2581 scope.go:117] "RemoveContainer" containerID="1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f" Apr 24 16:47:52.734932 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.734909 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-64bf5cc5fb-67w2t"] Apr 24 16:47:52.739875 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.739854 2581 scope.go:117] "RemoveContainer" containerID="aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9" Apr 24 16:47:52.746403 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.746380 2581 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:47:52.749381 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.749360 2581 scope.go:117] "RemoveContainer" containerID="1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686" Apr 24 16:47:52.751943 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.751919 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-849696d87b-dzp6r"] Apr 24 16:47:52.756773 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.756756 2581 scope.go:117] "RemoveContainer" containerID="1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f" Apr 24 16:47:52.757051 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.757031 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f\": container with ID starting with 1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f not found: ID does not exist" containerID="1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f" Apr 24 16:47:52.757116 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.757065 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f"} err="failed to get container status \"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f\": rpc error: code = NotFound desc = could not find container \"1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f\": container with ID starting with 1f0e403111fc85637523a2ad967201077e0450958cbb6f00b6a250cfd551439f not found: ID does not exist" Apr 24 16:47:52.757116 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.757093 2581 scope.go:117] "RemoveContainer" 
containerID="aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9" Apr 24 16:47:52.757334 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.757315 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9\": container with ID starting with aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9 not found: ID does not exist" containerID="aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9" Apr 24 16:47:52.757376 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.757339 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9"} err="failed to get container status \"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9\": rpc error: code = NotFound desc = could not find container \"aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9\": container with ID starting with aba6ef65dff4ef271cf817a8426e7d548f170f7254f96833e692692673fe24e9 not found: ID does not exist" Apr 24 16:47:52.757376 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.757357 2581 scope.go:117] "RemoveContainer" containerID="1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686" Apr 24 16:47:52.757647 ip-10-0-143-104 kubenswrapper[2581]: E0424 16:47:52.757628 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686\": container with ID starting with 1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686 not found: ID does not exist" containerID="1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686" Apr 24 16:47:52.757720 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.757655 2581 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686"} err="failed to get container status \"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686\": rpc error: code = NotFound desc = could not find container \"1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686\": container with ID starting with 1e5423692b1fdcd31c3b16612594c132f7abcbd598a399089b15fde226dc2686 not found: ID does not exist" Apr 24 16:47:52.885888 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.885811 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" path="/var/lib/kubelet/pods/2637fd52-945a-467d-8c18-3a8918b60faf/volumes" Apr 24 16:47:52.886303 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:47:52.886290 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c05d0d59-f939-480f-8087-d416851e1cea" path="/var/lib/kubelet/pods/c05d0d59-f939-480f-8087-d416851e1cea/volumes" Apr 24 16:49:24.815302 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:49:24.815271 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:49:24.816202 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:49:24.816182 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:54:24.839191 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:54:24.839154 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:54:24.840271 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:54:24.840251 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:59:24.862782 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:59:24.862753 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 16:59:24.865826 ip-10-0-143-104 kubenswrapper[2581]: I0424 16:59:24.865803 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:04:24.886215 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:04:24.886177 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:04:24.890843 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:04:24.890823 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:09:24.909074 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:09:24.908976 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:09:24.915037 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:09:24.914711 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:14:24.931863 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:14:24.931764 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:14:24.944271 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:14:24.944246 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:19:24.952718 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:19:24.952609 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:19:24.967218 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:19:24.967193 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:24:24.973991 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:24.973953 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:24:24.990711 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:24.990683 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:24:48.842772 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.842736 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wwhr9/must-gather-vhs9z"] Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843070 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="storage-initializer" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843080 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="storage-initializer" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843092 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" 
containerName="kserve-container" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843098 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843110 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kube-rbac-proxy" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843115 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kube-rbac-proxy" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843124 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kube-rbac-proxy" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843129 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kube-rbac-proxy" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843144 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843150 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843157 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="storage-initializer" Apr 24 17:24:48.843200 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843162 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="storage-initializer" Apr 24 17:24:48.843571 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843211 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kserve-container" Apr 24 17:24:48.843571 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843220 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kserve-container" Apr 24 17:24:48.843571 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843226 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c05d0d59-f939-480f-8087-d416851e1cea" containerName="kube-rbac-proxy" Apr 24 17:24:48.843571 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.843237 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2637fd52-945a-467d-8c18-3a8918b60faf" containerName="kube-rbac-proxy" Apr 24 17:24:48.846237 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.846220 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:48.848330 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.848304 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wwhr9\"/\"kube-root-ca.crt\"" Apr 24 17:24:48.848330 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.848321 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wwhr9\"/\"openshift-service-ca.crt\"" Apr 24 17:24:48.848957 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.848929 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wwhr9\"/\"default-dockercfg-sz7vl\"" Apr 24 17:24:48.859728 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.859702 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wwhr9/must-gather-vhs9z"] Apr 24 17:24:48.992476 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.992434 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw4dn\" (UniqueName: \"kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:48.992664 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:48.992492 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.093910 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.093821 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.094041 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.093921 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw4dn\" (UniqueName: \"kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.094198 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.094177 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.102533 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.102481 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw4dn\" (UniqueName: \"kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn\") pod \"must-gather-vhs9z\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.167988 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.167949 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:24:49.288841 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.288817 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wwhr9/must-gather-vhs9z"] Apr 24 17:24:49.291430 ip-10-0-143-104 kubenswrapper[2581]: W0424 17:24:49.291393 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c6f104a_ead6_4069_98d6_eb92870a715a.slice/crio-9440c66564104db8620455e733296e719413a028857fcc3c46ffe440d942f107 WatchSource:0}: Error finding container 9440c66564104db8620455e733296e719413a028857fcc3c46ffe440d942f107: Status 404 returned error can't find the container with id 9440c66564104db8620455e733296e719413a028857fcc3c46ffe440d942f107 Apr 24 17:24:49.294026 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.293481 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:24:49.359046 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:49.358958 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" event={"ID":"9c6f104a-ead6-4069-98d6-eb92870a715a","Type":"ContainerStarted","Data":"9440c66564104db8620455e733296e719413a028857fcc3c46ffe440d942f107"} Apr 24 17:24:55.384891 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:55.384848 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" event={"ID":"9c6f104a-ead6-4069-98d6-eb92870a715a","Type":"ContainerStarted","Data":"18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2"} Apr 24 17:24:55.384891 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:55.384895 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" 
event={"ID":"9c6f104a-ead6-4069-98d6-eb92870a715a","Type":"ContainerStarted","Data":"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858"} Apr 24 17:24:55.401409 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:24:55.401345 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" podStartSLOduration=2.364535267 podStartE2EDuration="7.401330117s" podCreationTimestamp="2026-04-24 17:24:48 +0000 UTC" firstStartedPulling="2026-04-24 17:24:49.293679237 +0000 UTC m=+2724.979403464" lastFinishedPulling="2026-04-24 17:24:54.330474079 +0000 UTC m=+2730.016198314" observedRunningTime="2026-04-24 17:24:55.399646076 +0000 UTC m=+2731.085370323" watchObservedRunningTime="2026-04-24 17:24:55.401330117 +0000 UTC m=+2731.087054361" Apr 24 17:25:13.445030 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:13.444939 2581 generic.go:358] "Generic (PLEG): container finished" podID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerID="2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858" exitCode=0 Apr 24 17:25:13.445030 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:13.445011 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" event={"ID":"9c6f104a-ead6-4069-98d6-eb92870a715a","Type":"ContainerDied","Data":"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858"} Apr 24 17:25:13.445436 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:13.445336 2581 scope.go:117] "RemoveContainer" containerID="2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858" Apr 24 17:25:14.095091 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.095068 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wwhr9_must-gather-vhs9z_9c6f104a-ead6-4069-98d6-eb92870a715a/gather/0.log" Apr 24 17:25:14.647855 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.647823 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-f77jl/must-gather-phmpc"] Apr 24 17:25:14.651620 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.651598 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.654407 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.654390 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"openshift-service-ca.crt\"" Apr 24 17:25:14.654539 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.654391 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f77jl\"/\"default-dockercfg-fbxhs\"" Apr 24 17:25:14.654945 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.654930 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"kube-root-ca.crt\"" Apr 24 17:25:14.658682 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.658335 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/must-gather-phmpc"] Apr 24 17:25:14.731125 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.731091 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbm9s\" (UniqueName: \"kubernetes.io/projected/95138554-0a1f-4c41-bd8b-09f71ca0a2be-kube-api-access-rbm9s\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.731294 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.731145 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95138554-0a1f-4c41-bd8b-09f71ca0a2be-must-gather-output\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 
17:25:14.832408 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.832376 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbm9s\" (UniqueName: \"kubernetes.io/projected/95138554-0a1f-4c41-bd8b-09f71ca0a2be-kube-api-access-rbm9s\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.832625 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.832423 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95138554-0a1f-4c41-bd8b-09f71ca0a2be-must-gather-output\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.832804 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.832783 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/95138554-0a1f-4c41-bd8b-09f71ca0a2be-must-gather-output\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.840951 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.840928 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbm9s\" (UniqueName: \"kubernetes.io/projected/95138554-0a1f-4c41-bd8b-09f71ca0a2be-kube-api-access-rbm9s\") pod \"must-gather-phmpc\" (UID: \"95138554-0a1f-4c41-bd8b-09f71ca0a2be\") " pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:14.961332 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:14.961298 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f77jl/must-gather-phmpc" Apr 24 17:25:15.084587 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:15.084350 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/must-gather-phmpc"] Apr 24 17:25:15.087392 ip-10-0-143-104 kubenswrapper[2581]: W0424 17:25:15.087360 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95138554_0a1f_4c41_bd8b_09f71ca0a2be.slice/crio-2feb84b9a07446cbd7731bd066da4662265b0ce51b790f6520733bc7552ff378 WatchSource:0}: Error finding container 2feb84b9a07446cbd7731bd066da4662265b0ce51b790f6520733bc7552ff378: Status 404 returned error can't find the container with id 2feb84b9a07446cbd7731bd066da4662265b0ce51b790f6520733bc7552ff378 Apr 24 17:25:15.451643 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:15.451552 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/must-gather-phmpc" event={"ID":"95138554-0a1f-4c41-bd8b-09f71ca0a2be","Type":"ContainerStarted","Data":"2feb84b9a07446cbd7731bd066da4662265b0ce51b790f6520733bc7552ff378"} Apr 24 17:25:17.459933 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:17.459895 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/must-gather-phmpc" event={"ID":"95138554-0a1f-4c41-bd8b-09f71ca0a2be","Type":"ContainerStarted","Data":"2e474d709dde8c61e88ced80b28461028ee2c6ba50e39f4a5182a9ef1382fe12"} Apr 24 17:25:17.460348 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:17.459940 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/must-gather-phmpc" event={"ID":"95138554-0a1f-4c41-bd8b-09f71ca0a2be","Type":"ContainerStarted","Data":"884f2d2fd34d852c1bab26304e635001d5f56d047458cc56aa4c1d8e1b835415"} Apr 24 17:25:17.477125 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:17.477047 2581 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-f77jl/must-gather-phmpc" podStartSLOduration=2.197926537 podStartE2EDuration="3.477028445s" podCreationTimestamp="2026-04-24 17:25:14 +0000 UTC" firstStartedPulling="2026-04-24 17:25:15.089302254 +0000 UTC m=+2750.775026492" lastFinishedPulling="2026-04-24 17:25:16.368404176 +0000 UTC m=+2752.054128400" observedRunningTime="2026-04-24 17:25:17.47455897 +0000 UTC m=+2753.160283213" watchObservedRunningTime="2026-04-24 17:25:17.477028445 +0000 UTC m=+2753.162752689" Apr 24 17:25:17.800759 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:17.800676 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-592vt_2f7a291a-6e18-499f-83a3-c48bffb2fdec/global-pull-secret-syncer/0.log" Apr 24 17:25:18.045687 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:18.045646 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m7ftj_0e0b1833-0ea6-4684-8c49-7ad78d75cec2/konnectivity-agent/0.log" Apr 24 17:25:18.114144 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:18.114010 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-104.ec2.internal_acbc91bbcdbc790f59d9cba82c01d807/haproxy/0.log" Apr 24 17:25:19.505437 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.505396 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wwhr9/must-gather-vhs9z"] Apr 24 17:25:19.509113 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.509086 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wwhr9/must-gather-vhs9z"] Apr 24 17:25:19.509867 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.509553 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="copy" 
containerID="cri-o://18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2" gracePeriod=2 Apr 24 17:25:19.511744 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.511669 2581 status_manager.go:895] "Failed to get status for pod" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" err="pods \"must-gather-vhs9z\" is forbidden: User \"system:node:ip-10-0-143-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wwhr9\": no relationship found between node 'ip-10-0-143-104.ec2.internal' and this object" Apr 24 17:25:19.881710 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.880650 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wwhr9_must-gather-vhs9z_9c6f104a-ead6-4069-98d6-eb92870a715a/copy/0.log" Apr 24 17:25:19.881710 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.881066 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:25:19.883527 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.883008 2581 status_manager.go:895] "Failed to get status for pod" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" err="pods \"must-gather-vhs9z\" is forbidden: User \"system:node:ip-10-0-143-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wwhr9\": no relationship found between node 'ip-10-0-143-104.ec2.internal' and this object" Apr 24 17:25:19.987447 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.987405 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output\") pod \"9c6f104a-ead6-4069-98d6-eb92870a715a\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " Apr 24 17:25:19.987655 
ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.987515 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw4dn\" (UniqueName: \"kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn\") pod \"9c6f104a-ead6-4069-98d6-eb92870a715a\" (UID: \"9c6f104a-ead6-4069-98d6-eb92870a715a\") " Apr 24 17:25:19.989582 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.989548 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9c6f104a-ead6-4069-98d6-eb92870a715a" (UID: "9c6f104a-ead6-4069-98d6-eb92870a715a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:25:19.992672 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:19.992643 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn" (OuterVolumeSpecName: "kube-api-access-qw4dn") pod "9c6f104a-ead6-4069-98d6-eb92870a715a" (UID: "9c6f104a-ead6-4069-98d6-eb92870a715a"). InnerVolumeSpecName "kube-api-access-qw4dn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:25:20.090552 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.088321 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qw4dn\" (UniqueName: \"kubernetes.io/projected/9c6f104a-ead6-4069-98d6-eb92870a715a-kube-api-access-qw4dn\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 17:25:20.090552 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.088363 2581 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9c6f104a-ead6-4069-98d6-eb92870a715a-must-gather-output\") on node \"ip-10-0-143-104.ec2.internal\" DevicePath \"\"" Apr 24 17:25:20.472912 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.472882 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wwhr9_must-gather-vhs9z_9c6f104a-ead6-4069-98d6-eb92870a715a/copy/0.log" Apr 24 17:25:20.473645 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.473616 2581 generic.go:358] "Generic (PLEG): container finished" podID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerID="18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2" exitCode=143 Apr 24 17:25:20.473936 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.473922 2581 scope.go:117] "RemoveContainer" containerID="18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2" Apr 24 17:25:20.474171 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.474157 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" Apr 24 17:25:20.486902 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.486786 2581 status_manager.go:895] "Failed to get status for pod" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" err="pods \"must-gather-vhs9z\" is forbidden: User \"system:node:ip-10-0-143-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wwhr9\": no relationship found between node 'ip-10-0-143-104.ec2.internal' and this object" Apr 24 17:25:20.490972 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.490934 2581 status_manager.go:895] "Failed to get status for pod" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" pod="openshift-must-gather-wwhr9/must-gather-vhs9z" err="pods \"must-gather-vhs9z\" is forbidden: User \"system:node:ip-10-0-143-104.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-wwhr9\": no relationship found between node 'ip-10-0-143-104.ec2.internal' and this object" Apr 24 17:25:20.497010 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.496588 2581 scope.go:117] "RemoveContainer" containerID="2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858" Apr 24 17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.522849 2581 scope.go:117] "RemoveContainer" containerID="18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2" Apr 24 17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: E0424 17:25:20.523214 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2\": container with ID starting with 18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2 not found: ID does not exist" containerID="18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2" Apr 24 
17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.523249 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2"} err="failed to get container status \"18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2\": rpc error: code = NotFound desc = could not find container \"18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2\": container with ID starting with 18398afe2a355dcc809aceb0cbe62ba8f037428e6a896094fbbd915ad06357c2 not found: ID does not exist" Apr 24 17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.523278 2581 scope.go:117] "RemoveContainer" containerID="2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858" Apr 24 17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: E0424 17:25:20.523523 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858\": container with ID starting with 2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858 not found: ID does not exist" containerID="2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858" Apr 24 17:25:20.523618 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.523548 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858"} err="failed to get container status \"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858\": rpc error: code = NotFound desc = could not find container \"2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858\": container with ID starting with 2cb161acb01b970e73b73d61a673ebf98b729764c6f5e1cd394d9b1dbc6dc858 not found: ID does not exist" Apr 24 17:25:20.895972 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:20.895882 2581 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" path="/var/lib/kubelet/pods/9c6f104a-ead6-4069-98d6-eb92870a715a/volumes" Apr 24 17:25:21.477416 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.477388 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/alertmanager/0.log" Apr 24 17:25:21.501214 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.501177 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/config-reloader/0.log" Apr 24 17:25:21.525217 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.525147 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/kube-rbac-proxy-web/0.log" Apr 24 17:25:21.548397 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.548364 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/kube-rbac-proxy/0.log" Apr 24 17:25:21.572802 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.572727 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/kube-rbac-proxy-metric/0.log" Apr 24 17:25:21.597212 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.597030 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/prom-label-proxy/0.log" Apr 24 17:25:21.620815 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.620751 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_90bb1fd8-5247-4ab8-b24f-09042b28a1d1/init-config-reloader/0.log" Apr 24 17:25:21.683472 ip-10-0-143-104 kubenswrapper[2581]: I0424 
17:25:21.683435 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4kcgp_f43bd50f-a7ab-4412-a611-b779465b96fa/kube-state-metrics/0.log" Apr 24 17:25:21.710901 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.710798 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4kcgp_f43bd50f-a7ab-4412-a611-b779465b96fa/kube-rbac-proxy-main/0.log" Apr 24 17:25:21.742908 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.742827 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4kcgp_f43bd50f-a7ab-4412-a611-b779465b96fa/kube-rbac-proxy-self/0.log" Apr 24 17:25:21.776962 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.776931 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-67859b89cc-28g4s_0e9cacec-d9bc-49d8-809b-54a40842359c/metrics-server/0.log" Apr 24 17:25:21.905187 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.905155 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9c8tw_8e4128aa-6993-4ee6-a68e-8f79e6b7bece/node-exporter/0.log" Apr 24 17:25:21.927960 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.927931 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9c8tw_8e4128aa-6993-4ee6-a68e-8f79e6b7bece/kube-rbac-proxy/0.log" Apr 24 17:25:21.952662 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:21.952632 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9c8tw_8e4128aa-6993-4ee6-a68e-8f79e6b7bece/init-textfile/0.log" Apr 24 17:25:22.059634 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.059536 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lxnf9_f1d225ba-df3d-4ffa-88a0-edb91a39eb75/kube-rbac-proxy-main/0.log" Apr 24 17:25:22.080975 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.080944 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lxnf9_f1d225ba-df3d-4ffa-88a0-edb91a39eb75/kube-rbac-proxy-self/0.log" Apr 24 17:25:22.103091 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.103057 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lxnf9_f1d225ba-df3d-4ffa-88a0-edb91a39eb75/openshift-state-metrics/0.log" Apr 24 17:25:22.149428 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.149400 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/prometheus/0.log" Apr 24 17:25:22.166259 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.166232 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/config-reloader/0.log" Apr 24 17:25:22.190523 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.190476 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/thanos-sidecar/0.log" Apr 24 17:25:22.211945 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.211912 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/kube-rbac-proxy-web/0.log" Apr 24 17:25:22.236282 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.236256 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/kube-rbac-proxy/0.log" Apr 24 17:25:22.258839 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.258811 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/kube-rbac-proxy-thanos/0.log" Apr 24 17:25:22.281687 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.281648 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5274be0-cad9-4355-bcd9-8ca6089639d0/init-config-reloader/0.log" Apr 24 17:25:22.313201 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.313111 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d8xhc_cf96c251-c45d-4d4c-a5f7-928308adfb3a/prometheus-operator/0.log" Apr 24 17:25:22.334168 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.334139 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-d8xhc_cf96c251-c45d-4d4c-a5f7-928308adfb3a/kube-rbac-proxy/0.log" Apr 24 17:25:22.359448 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.359398 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-g6gp8_b55d60c2-646a-407e-9882-06fac4ac6678/prometheus-operator-admission-webhook/0.log" Apr 24 17:25:22.386819 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.386791 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b649c869d-lvprs_61283d35-0e23-4324-b440-b03eba0c9a0f/telemeter-client/0.log" Apr 24 17:25:22.408632 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.408587 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6b649c869d-lvprs_61283d35-0e23-4324-b440-b03eba0c9a0f/reload/0.log" Apr 24 17:25:22.431942 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.431913 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-6b649c869d-lvprs_61283d35-0e23-4324-b440-b03eba0c9a0f/kube-rbac-proxy/0.log" Apr 24 17:25:22.460932 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.460892 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/thanos-query/0.log" Apr 24 17:25:22.484949 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.484914 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/kube-rbac-proxy-web/0.log" Apr 24 17:25:22.509547 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.509516 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/kube-rbac-proxy/0.log" Apr 24 17:25:22.532861 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.532834 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/prom-label-proxy/0.log" Apr 24 17:25:22.553853 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.553818 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/kube-rbac-proxy-rules/0.log" Apr 24 17:25:22.580027 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:22.579947 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-68b58d564c-vmd2m_425a7976-5d3c-47ca-9644-a15892195c45/kube-rbac-proxy-metrics/0.log" Apr 24 17:25:24.569216 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:24.569189 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b898679b9-52l5m_a1768070-3ece-4943-898e-d6ec601b7510/console/0.log" Apr 24 17:25:25.285630 ip-10-0-143-104 
kubenswrapper[2581]: I0424 17:25:25.285550 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc"] Apr 24 17:25:25.285974 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.285958 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="gather" Apr 24 17:25:25.285974 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.285976 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="gather" Apr 24 17:25:25.286087 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.285989 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="copy" Apr 24 17:25:25.286087 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.285995 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="copy" Apr 24 17:25:25.286087 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.286054 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="gather" Apr 24 17:25:25.286087 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.286062 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c6f104a-ead6-4069-98d6-eb92870a715a" containerName="copy" Apr 24 17:25:25.290343 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.290310 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.295714 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.295683 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc"] Apr 24 17:25:25.453862 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.453828 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-lib-modules\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.454072 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.453875 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sjl7\" (UniqueName: \"kubernetes.io/projected/ed186e52-c225-4d26-b4a3-013a1ff9253d-kube-api-access-6sjl7\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.454072 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.453904 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-sys\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.454072 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.453935 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-proc\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: 
\"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.454072 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.453969 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-podres\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.554579 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.554471 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-lib-modules\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555046 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555016 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sjl7\" (UniqueName: \"kubernetes.io/projected/ed186e52-c225-4d26-b4a3-013a1ff9253d-kube-api-access-6sjl7\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555131 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.554924 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-lib-modules\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555181 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555161 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-sys\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555216 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-proc\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555250 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555230 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-podres\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555284 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-sys\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555363 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555349 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-proc\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.555420 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.555406 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ed186e52-c225-4d26-b4a3-013a1ff9253d-podres\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.563561 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.563533 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sjl7\" (UniqueName: \"kubernetes.io/projected/ed186e52-c225-4d26-b4a3-013a1ff9253d-kube-api-access-6sjl7\") pod \"perf-node-gather-daemonset-fvzdc\" (UID: \"ed186e52-c225-4d26-b4a3-013a1ff9253d\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.604637 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.604596 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:25.738418 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.738392 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc"] Apr 24 17:25:25.741156 ip-10-0-143-104 kubenswrapper[2581]: W0424 17:25:25.741127 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poded186e52_c225_4d26_b4a3_013a1ff9253d.slice/crio-1009f0e371348136113e933a653120039bc0e355a6f5cdc65be961d606ba0940 WatchSource:0}: Error finding container 1009f0e371348136113e933a653120039bc0e355a6f5cdc65be961d606ba0940: Status 404 returned error can't find the container with id 1009f0e371348136113e933a653120039bc0e355a6f5cdc65be961d606ba0940 Apr 24 17:25:25.809372 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.809316 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p2t5q_bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769/dns/0.log" Apr 24 17:25:25.834554 ip-10-0-143-104 
kubenswrapper[2581]: I0424 17:25:25.834532 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p2t5q_bf8fd8e0-4aef-49f4-9fdb-98fb64fb7769/kube-rbac-proxy/0.log" Apr 24 17:25:25.934561 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:25.934531 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z9v4l_a13ed17d-5b83-44be-8c88-f98632b2ac89/dns-node-resolver/0.log" Apr 24 17:25:26.400125 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:26.400100 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vcjb9_f3323ce5-5f82-4b32-8290-e8a47d64634b/node-ca/0.log" Apr 24 17:25:26.505392 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:26.505357 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" event={"ID":"ed186e52-c225-4d26-b4a3-013a1ff9253d","Type":"ContainerStarted","Data":"9746a442f02cd864ce1623f147cbf36be7c5768b0b0044438ca522904d5def06"} Apr 24 17:25:26.505392 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:26.505397 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" event={"ID":"ed186e52-c225-4d26-b4a3-013a1ff9253d","Type":"ContainerStarted","Data":"1009f0e371348136113e933a653120039bc0e355a6f5cdc65be961d606ba0940"} Apr 24 17:25:26.505689 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:26.505477 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:26.521565 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:26.521476 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" podStartSLOduration=1.52145631 podStartE2EDuration="1.52145631s" podCreationTimestamp="2026-04-24 17:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:25:26.521318974 +0000 UTC m=+2762.207043219" watchObservedRunningTime="2026-04-24 17:25:26.52145631 +0000 UTC m=+2762.207180555" Apr 24 17:25:27.555591 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:27.555548 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6nvzh_7cc72beb-42b6-42d1-a909-660b999f008c/serve-healthcheck-canary/0.log" Apr 24 17:25:27.966403 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:27.966374 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7smgx_7bbec4d6-1e27-45e3-ae7f-fe2976a2de07/kube-rbac-proxy/0.log" Apr 24 17:25:27.985799 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:27.985776 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7smgx_7bbec4d6-1e27-45e3-ae7f-fe2976a2de07/exporter/0.log" Apr 24 17:25:28.006339 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:28.006312 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7smgx_7bbec4d6-1e27-45e3-ae7f-fe2976a2de07/extractor/0.log" Apr 24 17:25:30.109451 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:30.109425 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-w2nx7_f3ea16ba-f3a9-4183-b7a4-b080b0a08438/manager/0.log" Apr 24 17:25:30.577458 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:30.577430 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-jlt9x_d91a6800-a301-4363-b2e0-90a3d90c9242/s3-init/0.log" Apr 24 17:25:32.522417 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:32.521462 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-fvzdc" Apr 24 17:25:36.015902 ip-10-0-143-104 kubenswrapper[2581]: 
I0424 17:25:36.015871 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8vprt_d9f551fe-0d59-471b-b35c-3abef14bb13f/kube-multus/0.log" Apr 24 17:25:36.041009 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.040978 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/kube-multus-additional-cni-plugins/0.log" Apr 24 17:25:36.064709 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.064678 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/egress-router-binary-copy/0.log" Apr 24 17:25:36.085082 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.085047 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/cni-plugins/0.log" Apr 24 17:25:36.106062 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.106034 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/bond-cni-plugin/0.log" Apr 24 17:25:36.126432 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.126400 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/routeoverride-cni/0.log" Apr 24 17:25:36.149356 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.149330 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/whereabouts-cni-bincopy/0.log" Apr 24 17:25:36.170202 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.170176 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7xj6k_a611eeef-0446-421a-b3c5-d38e773087f7/whereabouts-cni/0.log" Apr 24 17:25:36.607908 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.607875 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q5b2h_d85b39e7-4145-4783-a50d-e94999b43e90/network-metrics-daemon/0.log" Apr 24 17:25:36.627227 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:36.627202 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q5b2h_d85b39e7-4145-4783-a50d-e94999b43e90/kube-rbac-proxy/0.log" Apr 24 17:25:37.402138 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.402104 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-controller/0.log" Apr 24 17:25:37.425539 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.425494 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/0.log" Apr 24 17:25:37.437100 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.437067 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovn-acl-logging/1.log" Apr 24 17:25:37.456335 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.456300 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/kube-rbac-proxy-node/0.log" Apr 24 17:25:37.480545 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.480519 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 17:25:37.503015 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.502993 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/northd/0.log" Apr 24 17:25:37.529233 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.529204 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/nbdb/0.log" Apr 24 17:25:37.551583 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.551556 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/sbdb/0.log" Apr 24 17:25:37.658692 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:37.658616 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k59gs_0048dae9-a5eb-4707-9a78-5385f148fdf1/ovnkube-controller/0.log" Apr 24 17:25:39.380963 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:39.380932 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9wjxs_4fda4ceb-5ea7-4202-903b-a9a5b5152485/network-check-target-container/0.log" Apr 24 17:25:40.330917 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:40.330888 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-9ldxt_8a2acb5b-dd6f-415a-a081-ae20b03878ff/iptables-alerter/0.log" Apr 24 17:25:40.995318 ip-10-0-143-104 kubenswrapper[2581]: I0424 17:25:40.995285 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rx464_f9857732-ae10-4c66-8e34-589690779e84/tuned/0.log"