Mar 18 16:44:19.344235 ip-10-0-139-43 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:19.840912 ip-10-0-139-43 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:19.840912 ip-10-0-139-43 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:19.840912 ip-10-0-139-43 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:19.840912 ip-10-0-139-43 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:19.840912 ip-10-0-139-43 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:19.844141 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.844041 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850418 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850436 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850440 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850443 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850446 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:19.850440 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850450 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850453 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850455 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850460 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850463 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850466 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850469 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850472 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850475 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850478 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850480 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850483 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850485 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850488 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850491 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850493 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850497 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850499 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850502 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:19.850656 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850505 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850507 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850510 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850516 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850519 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850522 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850524 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850527 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850529 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850531 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850534 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850537 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850540 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850543 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850546 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850549 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850551 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850554 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850556 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:19.851129 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850559 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850562 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850564 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850567 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850570 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850572 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850575 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850577 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850580 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850582 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850585 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850587 2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850590 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850592 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850595 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850598 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850600 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850603 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850605 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850607 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:19.851601 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850610 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850613 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850616 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850619 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850621 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850624 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850627 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850629 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850632 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850634 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850637 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850640 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850642 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850645 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850649 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850652 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850655 2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850657 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850661 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:19.852085 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850665 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850668 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850671 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.850673 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851068 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851073 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851076 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851079 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851081 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851084 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851087 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851090 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851093 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851096 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851099 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851101 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851104 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851106 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851109 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851112 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:19.852548 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851114 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851117 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851119 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851122 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851126 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851130 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851133 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851136 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851139 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851142 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851144 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851147 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851149 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851152 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851154 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851157 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851159 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851162 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851165 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:19.853071 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851167 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851169 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851172 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851175 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851177 2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851180 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851184 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851187 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851190 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851193 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851196 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851198 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851201 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851203 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851206 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851208 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851211 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851213 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851216 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851219 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:19.853532 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851221 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851224 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851228 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851231 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851234 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851236 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851239 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851241 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851244 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851246 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851249 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851251 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851254 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851256 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851259 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851261 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851265 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851267 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851270 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:19.854036 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851273 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851275 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851278 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851280 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851283 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851285 2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851288 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851290 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851292 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851295 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851297 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.851300 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851376 2571 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851386 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851395 2571 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851401 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851408 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851413 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851420 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851430 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851434 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:19.854509 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851437 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851440 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851447 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851450 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851454 2571 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851456 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851459 2571 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851462 2571 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851465 2571 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851468 2571 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851472 2571 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851475 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851478 2571 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851481 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851484 2571 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851488 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851491 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851494 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851497 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851500 2571 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851503 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851506 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851509 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851512 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851516 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:19.855019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851519 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851522 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851525 2571 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851529 2571 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851532 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851536 2571 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851539 2571 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851542 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851545 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851550 2571 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851554 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851557 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851560 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851563 2571 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851566 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851568 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851571 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851574 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851577 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851580 2571 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851582 2571 flags.go:64]
FLAG: --feature-gates="" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851586 2571 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851589 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851592 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851596 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851599 2571 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:19.855628 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851602 2571 flags.go:64] FLAG: --help="false" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851605 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-139-43.ec2.internal" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851608 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851611 2571 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851613 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851617 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851620 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851623 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:44:19.851626 2571 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851629 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851632 2571 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851636 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851639 2571 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851641 2571 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851644 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851648 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851651 2571 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851654 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851657 2571 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851659 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851662 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851665 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:19.856259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851670 2571 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:19.856259 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851673 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851676 2571 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851679 2571 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851681 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851685 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851688 2571 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851690 2571 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851695 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851698 2571 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851702 2571 flags.go:64] FLAG: --max-pods="110" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851705 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851708 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851711 2571 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851713 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851716 2571 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851719 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851722 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851730 2571 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851733 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851736 2571 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851740 2571 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851743 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851749 2571 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851752 2571 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:19.856801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851755 2571 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851760 2571 flags.go:64] FLAG: --port="10250" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851763 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851766 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a39e8a8d00511d82" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:44:19.851769 2571 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851772 2571 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851774 2571 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851777 2571 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851780 2571 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851784 2571 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851786 2571 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851789 2571 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851792 2571 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851796 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851798 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851801 2571 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851804 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851807 2571 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851809 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851812 2571 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851815 2571 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851818 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851820 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851823 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851826 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851829 2571 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:19.857407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851832 2571 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851835 2571 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851837 2571 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851840 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851843 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851846 2571 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851849 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851856 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 
16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851859 2571 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851862 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851866 2571 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851869 2571 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851872 2571 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851874 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851877 2571 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851880 2571 flags.go:64] FLAG: --v="2" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851884 2571 flags.go:64] FLAG: --version="false" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851888 2571 flags.go:64] FLAG: --vmodule="" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851893 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.851896 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852003 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852007 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: W0318 
16:44:19.852010 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852014 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:19.858094 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852016 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852019 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852022 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852024 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852027 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852029 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852032 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852035 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852037 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852040 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852042 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 
16:44:19.852045 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852048 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852051 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852053 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852057 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852060 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852062 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852065 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852067 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:19.858710 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852069 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852072 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852075 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852077 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 
16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852080 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852082 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852085 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852087 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852090 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852093 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852096 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852099 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852102 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852104 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852107 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852109 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852112 2571 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852115 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852117 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852120 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:19.859609 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852122 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852124 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852127 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852130 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852132 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852136 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852138 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852142 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852145 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 
16:44:19.852147 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852150 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852152 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852155 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852157 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852159 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852162 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852166 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852170 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852174 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:19.860439 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852177 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852180 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852183 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852187 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852190 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852193 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852195 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852198 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852201 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852204 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852206 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852209 
2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852212 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852215 2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852218 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852220 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852223 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852225 2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852228 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852231 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:19.861037 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852234 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852237 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.852239 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.853033 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.861396 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.861418 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861518 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861527 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861532 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861536 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861541 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861546 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861550 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861556 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861560 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861565 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:19.861634 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861569 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861573 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861578 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861582 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861586 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861590 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861595 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861599 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861604 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861608 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861612 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861617 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861621 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861625 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861629 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861635 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861639 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861643 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861648 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:19.862366 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861652 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861656 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861660 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861664 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861669 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861674 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861678 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861682 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861689 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861696 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861701 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861707 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861712 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861716 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861720 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861725 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861729 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861733 2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861740 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:19.862949 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861746 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861751 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861756 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861761 2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861765 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861769 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861774 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861778 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861782 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861786 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861791 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861796 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861800 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861804 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861809 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861814 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861818 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861823 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861827 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861831 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:19.863449 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861835 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861839 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861843 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861848 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861852 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861857 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861861 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861865 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861869 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861874 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861878 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861882 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861886 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861890 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861895 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861899 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861903 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:19.863963 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.861908 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.861916 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862088 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862097 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862102 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862106 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862112 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862117 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862122 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862127 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862131 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862136 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862140 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862144 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862151 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862171 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:19.864667 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862176 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862181 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862185 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862190 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862194 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862199 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862203 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862207 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862211 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862216 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862220 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862224 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862228 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862232 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862236 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862240 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862245 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862249 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862253 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862257 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:19.865377 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862261 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862265 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862269 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862274 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862279 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862283 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862287 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862291 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862297 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862303 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862308 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862312 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862316 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862320 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862324 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862328 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862332 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862336 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862340 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862344 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:19.865993 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862348 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862352 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862356 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862360 2571 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862364 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862368 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862372 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862376 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862380 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862384 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862388 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862392 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862396 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862400 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862404 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862408 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862413 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862417 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862421 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:19.866614 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862425 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862429 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862433 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862437 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862440 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862444 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862448 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862452 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862456 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862461 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862465 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862469 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:19.862473 2571 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.862481 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.863407 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:44:19.867101 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.866330 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:44:19.867487 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.867474 2571 server.go:1019] "Starting client certificate rotation"
Mar 18 16:44:19.867589 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.867570 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:44:19.867627 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.867607 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:44:19.897115 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.897086 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:44:19.899806 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.899782 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:44:19.920216 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.920189 2571 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:44:19.926023 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.926008 2571 log.go:25] "Validated CRI v1 image API"
Mar 18 16:44:19.927258 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.927245 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:44:19.927343 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.927328 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:19.931672 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.931645 2571 fs.go:135] Filesystem UUIDs: map[2eaafc26-b032-4ae2-adb2-d7f8f06a55a1:/dev/nvme0n1p3 38e31df5-08ca-4431-ba5d-086a5ea384a1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Mar 18 16:44:19.931718 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.931673 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:44:19.937742 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.937629 2571 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:19.935632948 +0000 UTC m=+0.465506131 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3140261 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2804e0a1cb3c89ebbe76e6a8f988c9 SystemUUID:ec2804e0-a1cb-3c89-ebbe-76e6a8f988c9 BootID:6683baef-a717-427a-bb58-52978b0854dd Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:da:e4:75:26:dd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:da:e4:75:26:dd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:00:47:60:69:5f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:44:19.937742 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.937739 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:44:19.937842 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.937823 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:44:19.939913 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.939891 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:44:19.940076 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.939915 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-43.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"cont
ainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 16:44:19.940122 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.940086 2571 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 16:44:19.940122 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.940095 2571 container_manager_linux.go:306] "Creating device plugin manager" Mar 18 16:44:19.940122 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.940108 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:19.940201 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.940129 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:44:19.942383 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.942372 2571 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:19.942506 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.942497 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:44:19.945257 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.945248 2571 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:44:19.945292 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.945267 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 16:44:19.945292 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.945290 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:44:19.945363 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.945300 2571 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:44:19.945363 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.945326 2571 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Mar 18 16:44:19.946457 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.946443 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:19.946522 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.946469 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:44:19.947876 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.947857 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f95xw" Mar 18 16:44:19.949604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.949590 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:44:19.953438 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.953411 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:44:19.955939 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:19.955916 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:44:19.956005 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:19.955916 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-43.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:44:19.956166 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956150 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/portworx-volume" Mar 18 16:44:19.956467 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956427 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:44:19.956675 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956665 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:44:19.956712 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956679 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:44:19.956712 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956689 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:44:19.956712 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956699 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:44:19.956712 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956708 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:44:19.956815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956717 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:44:19.956815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956740 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:44:19.956815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956760 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:44:19.956815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956779 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 16:44:19.956815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.956799 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:44:19.957734 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.957718 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:44:19.957734 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.957735 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:44:19.958632 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.958615 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f95xw" Mar 18 16:44:19.961790 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.961773 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:44:19.961860 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.961853 2571 server.go:1295] "Started kubelet" Mar 18 16:44:19.961992 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.961949 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:44:19.962054 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.961947 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:44:19.962054 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.962034 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:44:19.962823 ip-10-0-139-43 systemd[1]: Started Kubernetes Kubelet. Mar 18 16:44:19.963359 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.963265 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:44:19.964589 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.964571 2571 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:44:19.969949 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.969932 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:44:19.970523 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.970511 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:44:19.971051 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:19.971027 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:44:19.971118 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971109 2571 factory.go:55] Registering systemd factory Mar 18 16:44:19.971151 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971121 2571 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:44:19.971311 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971299 2571 factory.go:153] Registering CRI-O factory Mar 18 16:44:19.971358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971315 2571 factory.go:223] Registration of the crio container factory successfully Mar 18 16:44:19.971387 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971381 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:44:19.971432 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971420 2571 factory.go:103] Registering Raw factory Mar 18 16:44:19.971460 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971451 2571 manager.go:1196] Started watching for new ooms in manager Mar 18 16:44:19.971615 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971601 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:44:19.971662 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971603 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:44:19.971662 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971631 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:44:19.971765 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:19.971742 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:19.971798 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971785 2571 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:44:19.971844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.971799 2571 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:44:19.972342 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.972326 2571 manager.go:319] Starting recovery of all containers Mar 18 16:44:19.973618 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.973596 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:19.974041 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.974022 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-43.ec2.internal" not found Mar 18 16:44:19.976392 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:19.976373 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-43.ec2.internal\" not found" node="ip-10-0-139-43.ec2.internal" Mar 18 16:44:19.983714 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.983558 2571 manager.go:324] Recovery completed Mar 18 16:44:19.988768 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.988755 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:19.990131 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.990118 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-43.ec2.internal" not found Mar 18 16:44:19.991872 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.991857 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:19.991941 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.991890 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:19.991941 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:44:19.991904 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:19.992363 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.992350 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:44:19.992363 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.992361 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:44:19.992445 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.992377 2571 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:44:19.994119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.994106 2571 policy_none.go:49] "None policy: Start" Mar 18 16:44:19.994164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.994122 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:44:19.994164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:19.994134 2571 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:44:20.028578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.028564 2571 manager.go:341] "Starting Device Plugin manager" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.028594 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.028604 2571 server.go:85] "Starting device plugin registration server" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.028856 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.028868 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.029002 2571 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.029116 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.029125 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.029945 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.030010 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.048576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.046948 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-43.ec2.internal" not found Mar 18 16:44:20.119642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.119562 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:44:20.120763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.120746 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 18 16:44:20.120825 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.120778 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:44:20.120825 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.120806 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 18 16:44:20.120825 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.120815 2571 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:44:20.120935 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.120856 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:44:20.125557 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.125540 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:20.129817 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.129806 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:20.130644 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.130628 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:20.130718 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.130658 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:20.130718 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.130669 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:20.130718 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.130696 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.140039 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.140024 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.140092 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.140045 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-43.ec2.internal\": node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.156833 
ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.156812 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.221760 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.221730 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal"] Mar 18 16:44:20.221847 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.221832 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:20.223145 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.223131 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:20.223204 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.223161 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:20.223204 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.223170 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:20.224659 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.224648 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:20.225678 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.225649 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:20.225751 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.225697 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:20.225751 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.225712 2571 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:20.225751 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.225737 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.225866 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.225767 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:20.226798 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.226786 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:20.226847 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.226809 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:20.226847 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.226820 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:20.227906 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.227892 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.227967 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.227916 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:44:20.229770 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.229751 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:44:20.229850 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.229780 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:44:20.229850 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.229794 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:44:20.250603 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.250586 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-43.ec2.internal\" not found" node="ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.254882 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.254865 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-43.ec2.internal\" not found" node="ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.257344 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.257329 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.358427 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.358387 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.373805 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.373744 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.373912 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.373812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf17cac2b800afcc6c2fd3a4eb5e5a86-config\") pod \"kube-apiserver-proxy-ip-10-0-139-43.ec2.internal\" (UID: \"cf17cac2b800afcc6c2fd3a4eb5e5a86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.373912 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.373890 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.459208 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.459166 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found" Mar 18 16:44:20.474595 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.474572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.474680 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:44:20.474605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.474680 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.474631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf17cac2b800afcc6c2fd3a4eb5e5a86-config\") pod \"kube-apiserver-proxy-ip-10-0-139-43.ec2.internal\" (UID: \"cf17cac2b800afcc6c2fd3a4eb5e5a86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.474763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.474675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.474763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.474697 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cf17cac2b800afcc6c2fd3a4eb5e5a86-config\") pod \"kube-apiserver-proxy-ip-10-0-139-43.ec2.internal\" (UID: \"cf17cac2b800afcc6c2fd3a4eb5e5a86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" Mar 18 16:44:20.474763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.474675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e3bdf790e513236fc421e525c4ca1df-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal\" (UID: \"2e3bdf790e513236fc421e525c4ca1df\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal"
Mar 18 16:44:20.554761 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.554724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal"
Mar 18 16:44:20.556257 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.556239 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal"
Mar 18 16:44:20.559927 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.559911 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:20.660591 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.660492 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:20.761006 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.760945 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:20.861555 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.861525 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:20.867698 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.867669 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:44:20.867860 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.867844 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:20.867899 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.867857 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:20.960674 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.960599 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:19 +0000 UTC" deadline="2027-12-28 14:17:42.118134772 +0000 UTC"
Mar 18 16:44:20.960674 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.960635 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15597h33m21.15750336s"
Mar 18 16:44:20.961686 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:20.961666 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:20.970865 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.970836 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:20.986422 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:20.986396 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:21.013066 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.013041 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qc9sb"
Mar 18 16:44:21.019175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.019151 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qc9sb"
Mar 18 16:44:21.062336 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.062308 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:21.103674 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:21.103622 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e3bdf790e513236fc421e525c4ca1df.slice/crio-a95a5e554ca9d4fa9e8961244f0bcf1b423715cb16742b20aa80034119582f1b WatchSource:0}: Error finding container a95a5e554ca9d4fa9e8961244f0bcf1b423715cb16742b20aa80034119582f1b: Status 404 returned error can't find the container with id a95a5e554ca9d4fa9e8961244f0bcf1b423715cb16742b20aa80034119582f1b
Mar 18 16:44:21.104077 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:21.104057 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf17cac2b800afcc6c2fd3a4eb5e5a86.slice/crio-918f4908214bd989f31db1a02c4a64d2eba45bd3d46d595d3783be46fd0dd6f3 WatchSource:0}: Error finding container 918f4908214bd989f31db1a02c4a64d2eba45bd3d46d595d3783be46fd0dd6f3: Status 404 returned error can't find the container with id 918f4908214bd989f31db1a02c4a64d2eba45bd3d46d595d3783be46fd0dd6f3
Mar 18 16:44:21.109440 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.109426 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:44:21.123749 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.123712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" event={"ID":"2e3bdf790e513236fc421e525c4ca1df","Type":"ContainerStarted","Data":"a95a5e554ca9d4fa9e8961244f0bcf1b423715cb16742b20aa80034119582f1b"}
Mar 18 16:44:21.124680 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.124661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" event={"ID":"cf17cac2b800afcc6c2fd3a4eb5e5a86","Type":"ContainerStarted","Data":"918f4908214bd989f31db1a02c4a64d2eba45bd3d46d595d3783be46fd0dd6f3"}
Mar 18 16:44:21.162910 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.162882 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:21.221056 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.220943 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:21.263549 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.263522 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:21.364074 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.364039 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:21.464890 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.464864 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-43.ec2.internal\" not found"
Mar 18 16:44:21.523661 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.523592 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:21.571594 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.571561 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal"
Mar 18 16:44:21.584724 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.584606 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:21.585893 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.585662 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal"
Mar 18 16:44:21.593449 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.593317 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:21.947443 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.947411 2571 apiserver.go:52] "Watching apiserver"
Mar 18 16:44:21.952852 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.952822 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Mar 18 16:44:21.953360 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.953330 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-kv8gs","kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk","openshift-multus/multus-additional-cni-plugins-fpm5p","openshift-multus/multus-lfgl5","openshift-network-diagnostics/network-check-target-qv92n","openshift-ovn-kubernetes/ovnkube-node-m9kqz","openshift-cluster-node-tuning-operator/tuned-nhkg8","openshift-dns/node-resolver-fd4lp","openshift-image-registry/node-ca-rwpwz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal","openshift-multus/network-metrics-daemon-jf76n","openshift-network-operator/iptables-alerter-5nt27"]
Mar 18 16:44:21.955904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.955885 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.957740 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.957718 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.958664 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Mar 18 16:44:21.958664 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958420 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.958664 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958428 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Mar 18 16:44:21.958664 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958490 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.958890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958715 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rq8lg\""
Mar 18 16:44:21.958890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958843 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Mar 18 16:44:21.958890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.958843 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 18 16:44:21.959635 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.959614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:21.960507 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.960486 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.960603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.960552 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.960769 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.960753 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sf4nn\""
Mar 18 16:44:21.961145 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.960826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.961413 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.961245 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Mar 18 16:44:21.962580 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.962560 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Mar 18 16:44:21.962666 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.962609 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.963755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.963396 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.963755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.963449 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Mar 18 16:44:21.963755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.963475 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rrr6x\""
Mar 18 16:44:21.963755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.963399 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 18 16:44:21.963995 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.963830 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnrtd\""
Mar 18 16:44:21.964832 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.964814 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n"
Mar 18 16:44:21.964942 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.964914 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:21.965142 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.965119 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d"
Mar 18 16:44:21.965524 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.965506 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Mar 18 16:44:21.966723 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.966707 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.966925 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.966895 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Mar 18 16:44:21.967329 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.967308 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-974rr\""
Mar 18 16:44:21.967416 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.967317 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Mar 18 16:44:21.968276 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.968256 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:21.968535 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.968517 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.968813 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.968794 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.969083 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.969053 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-47rlt\""
Mar 18 16:44:21.970090 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.969935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.970090 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.970035 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:21.970244 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.970090 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-msxvk\""
Mar 18 16:44:21.970587 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.970570 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.972075 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.971860 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n"
Mar 18 16:44:21.972075 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:21.971904 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781"
Mar 18 16:44:21.972371 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.972349 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mh6jl\""
Mar 18 16:44:21.972633 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.972612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.972716 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.972637 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.973007 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.972989 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Mar 18 16:44:21.973839 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.973816 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5nt27"
Mar 18 16:44:21.975604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.975585 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:44:21.975801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.975787 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 18 16:44:21.975911 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.975895 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Mar 18 16:44:21.975997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.975943 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kg6fd\""
Mar 18 16:44:21.982769 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n"
Mar 18 16:44:21.982858 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chw2c\" (UniqueName: \"kubernetes.io/projected/17730a74-9827-4a73-be22-72ef96f3aeb0-kube-api-access-chw2c\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:21.982858 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982806 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-os-release\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:21.982858 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rg7s\" (UniqueName: \"kubernetes.io/projected/bf79a021-e091-4ae2-bd19-2bd1205de781-kube-api-access-9rg7s\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n"
Mar 18 16:44:21.982997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-k8s-cni-cncf-io\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.982997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.982997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-kubelet\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.982997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.982961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-system-cni-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:21.983172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysconfig\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983040 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n"
Mar 18 16:44:21.983172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-etc-kubernetes\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.983172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-serviceca\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:21.983172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdp2\" (UniqueName: \"kubernetes.io/projected/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-kube-api-access-6xdp2\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-modprobe-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983244 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-registration-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-sys-fs\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dv8b\" (UniqueName: \"kubernetes.io/projected/3fc7c75c-9d88-4d69-a623-1eb256939d93-kube-api-access-4dv8b\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.983358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17730a74-9827-4a73-be22-72ef96f3aeb0-tmp-dir\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-system-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983409 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-conf\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-run\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d41cdd0-c3ee-43e6-84d1-051cad027162-konnectivity-ca\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-host\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-slash\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-systemd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-env-overrides\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wtrv\" (UniqueName: \"kubernetes.io/projected/ee65363f-46b9-4194-8b09-6d6e1a39303b-kube-api-access-7wtrv\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-systemd\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-tmp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17730a74-9827-4a73-be22-72ef96f3aeb0-hosts-file\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lgf\" (UniqueName: \"kubernetes.io/projected/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-kube-api-access-c5lgf\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-node-log\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983809 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-script-lib\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-kubernetes\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-socket-dir-parent\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-log-socket\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovn-node-metrics-cert\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-cnibin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.983961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-netns\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.983986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-bin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984008 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984064 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8sdk\" (UniqueName: \"kubernetes.io/projected/97af0190-d870-4459-8d1a-d160247ed29f-kube-api-access-f8sdk\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-kubelet\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-conf-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-tuned\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984306 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d41cdd0-c3ee-43e6-84d1-051cad027162-agent-certs\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984338 2571 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-netns\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-lib-modules\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-multus\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-var-lib-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.984527 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " 
pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-device-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-daemon-config\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-sys\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-ovn\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984668 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-socket-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984729 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-os-release\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-var-lib-kubelet\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984775 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-host\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946dp\" (UniqueName: \"kubernetes.io/projected/ef75d54c-b5e4-43b6-a240-283b489f554e-kube-api-access-946dp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-netd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cnibin\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-bin\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.984949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:21.985159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-hostroot\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-systemd-units\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-etc-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985148 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-config\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-cni-binary-copy\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:21.985879 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:21.985193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-multus-certs\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.020134 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.020100 2571 certificate_manager.go:715] "Certificate 
rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:21 +0000 UTC" deadline="2027-08-27 16:40:56.257251427 +0000 UTC" Mar 18 16:44:22.020242 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.020135 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12647h56m34.237119992s" Mar 18 16:44:22.072485 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.072462 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:22.085561 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085528 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-os-release\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.085739 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rg7s\" (UniqueName: \"kubernetes.io/projected/bf79a021-e091-4ae2-bd19-2bd1205de781-kube-api-access-9rg7s\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:22.085739 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085600 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-k8s-cni-cncf-io\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.085739 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085645 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.085739 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-kubelet\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.085739 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-system-cni-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-os-release\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-kubelet\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:44:22.085771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysconfig\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-system-cni-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-k8s-cni-cncf-io\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " 
pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-etc-kubernetes\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085880 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-serviceca\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdp2\" (UniqueName: \"kubernetes.io/projected/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-kube-api-access-6xdp2\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysconfig\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-modprobe-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.085948 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.085939 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-etc-kubernetes\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086041 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.086045 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.586004691 +0000 UTC m=+3.115877856 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-modprobe-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-registration-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-sys-fs\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dv8b\" (UniqueName: \"kubernetes.io/projected/3fc7c75c-9d88-4d69-a623-1eb256939d93-kube-api-access-4dv8b\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:44:22.086230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17730a74-9827-4a73-be22-72ef96f3aeb0-tmp-dir\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-registration-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-sys-fs\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-system-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91a1650a-2ef0-48b3-a7c3-38900240dde0-iptables-alerter-script\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.086720 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086306 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-system-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-conf\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-run\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d41cdd0-c3ee-43e6-84d1-051cad027162-konnectivity-ca\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-host\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:22.086720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086404 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-serviceca\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-slash\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-run\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-systemd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-systemd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086489 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-host\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-env-overrides\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086489 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-conf\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtrv\" (UniqueName: \"kubernetes.io/projected/ee65363f-46b9-4194-8b09-6d6e1a39303b-kube-api-access-7wtrv\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17730a74-9827-4a73-be22-72ef96f3aeb0-tmp-dir\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-slash\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-systemd\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-tmp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17730a74-9827-4a73-be22-72ef96f3aeb0-hosts-file\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-systemd\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lgf\" (UniqueName: \"kubernetes.io/projected/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-kube-api-access-c5lgf\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/17730a74-9827-4a73-be22-72ef96f3aeb0-hosts-file\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-node-log\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.087469 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-script-lib\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-kubernetes\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-node-log\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-kubernetes\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086896 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-socket-dir-parent\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086921 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-log-socket\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovn-node-metrics-cert\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-cnibin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6d41cdd0-c3ee-43e6-84d1-051cad027162-konnectivity-ca\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-log-socket\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-socket-dir-parent\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.086994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-env-overrides\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-netns\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-bin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-netns\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087077 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-cnibin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-bin\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.088254 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-ovn-kubernetes\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8sdk\" (UniqueName: \"kubernetes.io/projected/97af0190-d870-4459-8d1a-d160247ed29f-kube-api-access-f8sdk\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-kubelet\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087228 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-conf-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087258 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzljh\" (UniqueName: \"kubernetes.io/projected/91a1650a-2ef0-48b3-a7c3-38900240dde0-kube-api-access-hzljh\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-tuned\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-conf-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-kubelet\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-script-lib\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d41cdd0-c3ee-43e6-84d1-051cad027162-agent-certs\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-netns\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-lib-modules\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-multus\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089109 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91a1650a-2ef0-48b3-a7c3-38900240dde0-host-slash\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087455 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-var-lib-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087468 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-run-netns\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-var-lib-cni-multus\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-lib-modules\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-device-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-daemon-config\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087620 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-device-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-sys\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-sys\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-ovn\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-socket-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-os-release\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.089844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-var-lib-kubelet\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-host\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-946dp\" (UniqueName: \"kubernetes.io/projected/ef75d54c-b5e4-43b6-a240-283b489f554e-kube-api-access-946dp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-netd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cnibin\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-bin\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.087986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-hostroot\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-systemd-units\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-etc-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-config\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-cni-binary-copy\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-multus-certs\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.090668 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chw2c\" (UniqueName: \"kubernetes.io/projected/17730a74-9827-4a73-be22-72ef96f3aeb0-kube-api-access-chw2c\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088164 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-daemon-config\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-bin\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-systemd-units\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-run-ovn\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-etc-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088440 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-host\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/97af0190-d870-4459-8d1a-d160247ed29f-socket-dir\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-os-release\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088638 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-var-lib-kubelet\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8"
Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-host-run-multus-certs\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5"
Mar 18 
16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-sysctl-d\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cni-binary-copy\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-multus-cni-dir\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovnkube-config\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3fc7c75c-9d88-4d69-a623-1eb256939d93-hostroot\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.091334 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:44:22.088901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-host-cni-netd\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.088986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-cnibin\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.089028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ee65363f-46b9-4194-8b09-6d6e1a39303b-var-lib-openvswitch\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.089188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fc7c75c-9d88-4d69-a623-1eb256939d93-cni-binary-copy\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.090943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-etc-tuned\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.091089 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee65363f-46b9-4194-8b09-6d6e1a39303b-ovn-node-metrics-cert\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.091514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef75d54c-b5e4-43b6-a240-283b489f554e-tmp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.091961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.091746 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6d41cdd0-c3ee-43e6-84d1-051cad027162-agent-certs\") pod \"konnectivity-agent-kv8gs\" (UID: \"6d41cdd0-c3ee-43e6-84d1-051cad027162\") " pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:22.095386 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.095124 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:22.095386 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.095147 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:22.095386 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.095161 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 
16:44:22.095386 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.095221 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.595200779 +0000 UTC m=+3.125073938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:22.096805 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.096780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dv8b\" (UniqueName: \"kubernetes.io/projected/3fc7c75c-9d88-4d69-a623-1eb256939d93-kube-api-access-4dv8b\") pod \"multus-lfgl5\" (UID: \"3fc7c75c-9d88-4d69-a623-1eb256939d93\") " pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.097460 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.097428 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdp2\" (UniqueName: \"kubernetes.io/projected/d2c29efd-c4cf-40cc-91bb-1c82e76eea41-kube-api-access-6xdp2\") pod \"multus-additional-cni-plugins-fpm5p\" (UID: \"d2c29efd-c4cf-40cc-91bb-1c82e76eea41\") " pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.097711 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.097669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rg7s\" (UniqueName: \"kubernetes.io/projected/bf79a021-e091-4ae2-bd19-2bd1205de781-kube-api-access-9rg7s\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " 
pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:22.098673 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.098652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtrv\" (UniqueName: \"kubernetes.io/projected/ee65363f-46b9-4194-8b09-6d6e1a39303b-kube-api-access-7wtrv\") pod \"ovnkube-node-m9kqz\" (UID: \"ee65363f-46b9-4194-8b09-6d6e1a39303b\") " pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.099418 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.099367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-946dp\" (UniqueName: \"kubernetes.io/projected/ef75d54c-b5e4-43b6-a240-283b489f554e-kube-api-access-946dp\") pod \"tuned-nhkg8\" (UID: \"ef75d54c-b5e4-43b6-a240-283b489f554e\") " pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.099740 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.099720 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:22.099883 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.099755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8sdk\" (UniqueName: \"kubernetes.io/projected/97af0190-d870-4459-8d1a-d160247ed29f-kube-api-access-f8sdk\") pod \"aws-ebs-csi-driver-node-k5jxk\" (UID: \"97af0190-d870-4459-8d1a-d160247ed29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.100393 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.100374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chw2c\" (UniqueName: \"kubernetes.io/projected/17730a74-9827-4a73-be22-72ef96f3aeb0-kube-api-access-chw2c\") pod \"node-resolver-fd4lp\" (UID: \"17730a74-9827-4a73-be22-72ef96f3aeb0\") " pod="openshift-dns/node-resolver-fd4lp" Mar 18 16:44:22.100498 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.100480 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lgf\" (UniqueName: \"kubernetes.io/projected/8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae-kube-api-access-c5lgf\") pod \"node-ca-rwpwz\" (UID: \"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae\") " pod="openshift-image-registry/node-ca-rwpwz" Mar 18 16:44:22.123354 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.123327 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:22.188486 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.188454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91a1650a-2ef0-48b3-a7c3-38900240dde0-iptables-alerter-script\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.188637 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.188503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzljh\" (UniqueName: \"kubernetes.io/projected/91a1650a-2ef0-48b3-a7c3-38900240dde0-kube-api-access-hzljh\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.188637 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.188520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91a1650a-2ef0-48b3-a7c3-38900240dde0-host-slash\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.188637 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.188575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/91a1650a-2ef0-48b3-a7c3-38900240dde0-host-slash\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.188856 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.188837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/91a1650a-2ef0-48b3-a7c3-38900240dde0-iptables-alerter-script\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.197391 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.197361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzljh\" (UniqueName: \"kubernetes.io/projected/91a1650a-2ef0-48b3-a7c3-38900240dde0-kube-api-access-hzljh\") pod \"iptables-alerter-5nt27\" (UID: \"91a1650a-2ef0-48b3-a7c3-38900240dde0\") " pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.275408 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.275330 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:22.281572 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.281548 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" Mar 18 16:44:22.290232 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.290212 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" Mar 18 16:44:22.295820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.295802 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lfgl5" Mar 18 16:44:22.302533 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.302508 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:22.308096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.308076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" Mar 18 16:44:22.316613 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.316592 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fd4lp" Mar 18 16:44:22.322195 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.322177 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rwpwz" Mar 18 16:44:22.327674 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.327657 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5nt27" Mar 18 16:44:22.591621 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.591548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:22.591774 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.591671 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:22.591774 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.591730 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.591713965 +0000 UTC m=+4.121587124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:22.691990 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:22.691946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:22.692150 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.692081 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:22.692150 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.692099 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:22.692150 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.692108 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:22.692276 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:22.692157 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:23.692142771 +0000 UTC m=+4.222015934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:22.741403 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.741379 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17730a74_9827_4a73_be22_72ef96f3aeb0.slice/crio-664530328a768f8c5dadf1704e1fe1690e55289a15ef1b316f9c3a153c702db9 WatchSource:0}: Error finding container 664530328a768f8c5dadf1704e1fe1690e55289a15ef1b316f9c3a153c702db9: Status 404 returned error can't find the container with id 664530328a768f8c5dadf1704e1fe1690e55289a15ef1b316f9c3a153c702db9 Mar 18 16:44:22.743603 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.743579 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee65363f_46b9_4194_8b09_6d6e1a39303b.slice/crio-e11ebd2b984ce0878eeab938a3066264e853e7eb31b28c1c8c01bc1a1817a5c1 WatchSource:0}: Error finding container e11ebd2b984ce0878eeab938a3066264e853e7eb31b28c1c8c01bc1a1817a5c1: Status 404 returned error can't find the container with id e11ebd2b984ce0878eeab938a3066264e853e7eb31b28c1c8c01bc1a1817a5c1 Mar 18 16:44:22.744230 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.744207 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef75d54c_b5e4_43b6_a240_283b489f554e.slice/crio-efc185d4d7f87044b2a4f39683f205071b9efe9566cf7c357d5e1e0f220c9cf0 WatchSource:0}: Error finding container 
efc185d4d7f87044b2a4f39683f205071b9efe9566cf7c357d5e1e0f220c9cf0: Status 404 returned error can't find the container with id efc185d4d7f87044b2a4f39683f205071b9efe9566cf7c357d5e1e0f220c9cf0 Mar 18 16:44:22.745243 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.745133 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc7c75c_9d88_4d69_a623_1eb256939d93.slice/crio-98c0be24aeac90f6bdd2de4c24fdec74fbe93a3696adbe7fb2ffd299a908bb4a WatchSource:0}: Error finding container 98c0be24aeac90f6bdd2de4c24fdec74fbe93a3696adbe7fb2ffd299a908bb4a: Status 404 returned error can't find the container with id 98c0be24aeac90f6bdd2de4c24fdec74fbe93a3696adbe7fb2ffd299a908bb4a Mar 18 16:44:22.747493 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.747407 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c29efd_c4cf_40cc_91bb_1c82e76eea41.slice/crio-43b0de2821b62b8b19c05c9081a1c507865303813583c0357bed5867ac8a7153 WatchSource:0}: Error finding container 43b0de2821b62b8b19c05c9081a1c507865303813583c0357bed5867ac8a7153: Status 404 returned error can't find the container with id 43b0de2821b62b8b19c05c9081a1c507865303813583c0357bed5867ac8a7153 Mar 18 16:44:22.749717 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.749600 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8197d1a6_4bd6_4f2e_9e43_e0bd2dff50ae.slice/crio-e3b1722d78434ffab7cd6426250caeca323a60c42cda43df700a1b93d4e03c24 WatchSource:0}: Error finding container e3b1722d78434ffab7cd6426250caeca323a60c42cda43df700a1b93d4e03c24: Status 404 returned error can't find the container with id e3b1722d78434ffab7cd6426250caeca323a60c42cda43df700a1b93d4e03c24 Mar 18 16:44:22.750392 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.750368 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a1650a_2ef0_48b3_a7c3_38900240dde0.slice/crio-f63831655823af838949b84233d01779a4318a8e6d748f5a10060228352980c0 WatchSource:0}: Error finding container f63831655823af838949b84233d01779a4318a8e6d748f5a10060228352980c0: Status 404 returned error can't find the container with id f63831655823af838949b84233d01779a4318a8e6d748f5a10060228352980c0 Mar 18 16:44:22.752077 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.751955 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97af0190_d870_4459_8d1a_d160247ed29f.slice/crio-2d2d6f637832610ea64ca721e665555668ca39189cf6bab6f4f9d659b5639e1d WatchSource:0}: Error finding container 2d2d6f637832610ea64ca721e665555668ca39189cf6bab6f4f9d659b5639e1d: Status 404 returned error can't find the container with id 2d2d6f637832610ea64ca721e665555668ca39189cf6bab6f4f9d659b5639e1d Mar 18 16:44:22.753946 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:44:22.753892 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d41cdd0_c3ee_43e6_84d1_051cad027162.slice/crio-3d0d3847e19c5725f79cec1ad03386daa565a66fb70024ef0dc39478e1c13c49 WatchSource:0}: Error finding container 3d0d3847e19c5725f79cec1ad03386daa565a66fb70024ef0dc39478e1c13c49: Status 404 returned error can't find the container with id 3d0d3847e19c5725f79cec1ad03386daa565a66fb70024ef0dc39478e1c13c49 Mar 18 16:44:23.020936 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.020725 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:21 +0000 UTC" deadline="2027-08-27 21:40:06.374514922 +0000 UTC" Mar 18 16:44:23.020936 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.020933 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12652h55m43.353596933s" Mar 18 16:44:23.128406 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.128362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" event={"ID":"cf17cac2b800afcc6c2fd3a4eb5e5a86","Type":"ContainerStarted","Data":"9d59ea1ee8e490409f5542367bbc2a27b8e6f1f770ce25691e484093f3ad4cd6"} Mar 18 16:44:23.129645 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.129614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kv8gs" event={"ID":"6d41cdd0-c3ee-43e6-84d1-051cad027162","Type":"ContainerStarted","Data":"3d0d3847e19c5725f79cec1ad03386daa565a66fb70024ef0dc39478e1c13c49"} Mar 18 16:44:23.130807 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.130764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5nt27" event={"ID":"91a1650a-2ef0-48b3-a7c3-38900240dde0","Type":"ContainerStarted","Data":"f63831655823af838949b84233d01779a4318a8e6d748f5a10060228352980c0"} Mar 18 16:44:23.131945 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.131917 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rwpwz" event={"ID":"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae","Type":"ContainerStarted","Data":"e3b1722d78434ffab7cd6426250caeca323a60c42cda43df700a1b93d4e03c24"} Mar 18 16:44:23.133539 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.133504 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"e11ebd2b984ce0878eeab938a3066264e853e7eb31b28c1c8c01bc1a1817a5c1"} Mar 18 16:44:23.134761 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.134729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" 
event={"ID":"97af0190-d870-4459-8d1a-d160247ed29f","Type":"ContainerStarted","Data":"2d2d6f637832610ea64ca721e665555668ca39189cf6bab6f4f9d659b5639e1d"} Mar 18 16:44:23.135720 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.135689 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerStarted","Data":"43b0de2821b62b8b19c05c9081a1c507865303813583c0357bed5867ac8a7153"} Mar 18 16:44:23.137104 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.137060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfgl5" event={"ID":"3fc7c75c-9d88-4d69-a623-1eb256939d93","Type":"ContainerStarted","Data":"98c0be24aeac90f6bdd2de4c24fdec74fbe93a3696adbe7fb2ffd299a908bb4a"} Mar 18 16:44:23.138228 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.138207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" event={"ID":"ef75d54c-b5e4-43b6-a240-283b489f554e","Type":"ContainerStarted","Data":"efc185d4d7f87044b2a4f39683f205071b9efe9566cf7c357d5e1e0f220c9cf0"} Mar 18 16:44:23.139477 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.139453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fd4lp" event={"ID":"17730a74-9827-4a73-be22-72ef96f3aeb0","Type":"ContainerStarted","Data":"664530328a768f8c5dadf1704e1fe1690e55289a15ef1b316f9c3a153c702db9"} Mar 18 16:44:23.143630 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.143582 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-43.ec2.internal" podStartSLOduration=2.143571266 podStartE2EDuration="2.143571266s" podCreationTimestamp="2026-03-18 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:23.143033101 +0000 
UTC m=+3.672906282" watchObservedRunningTime="2026-03-18 16:44:23.143571266 +0000 UTC m=+3.673444445" Mar 18 16:44:23.599272 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.599230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:23.599488 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.599420 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:23.599488 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.599485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:25.599466584 +0000 UTC m=+6.129339748 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:23.700942 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:23.700291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:23.700942 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.700483 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:23.700942 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.700504 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:23.700942 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.700516 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:23.700942 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:23.700576 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:25.700558275 +0000 UTC m=+6.230431437 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:24.124373 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:24.123609 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:24.124373 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:24.123733 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:24.124373 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:24.124113 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:24.124373 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:24.124225 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:24.160859 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:24.160000 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e3bdf790e513236fc421e525c4ca1df" containerID="fb8363d4cf57109e28064bcfd2208cfec55dadf6717da0182d9d273cdce29070" exitCode=0 Mar 18 16:44:24.160859 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:24.160555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" event={"ID":"2e3bdf790e513236fc421e525c4ca1df","Type":"ContainerDied","Data":"fb8363d4cf57109e28064bcfd2208cfec55dadf6717da0182d9d273cdce29070"} Mar 18 16:44:25.174566 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:25.173763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" event={"ID":"2e3bdf790e513236fc421e525c4ca1df","Type":"ContainerStarted","Data":"ab8ce4b286c30ebf8eb869724e64e8373f7a741e2e4c4f4ebc15ee59b502e204"} Mar 18 16:44:25.191180 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:25.190106 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-43.ec2.internal" podStartSLOduration=4.190087658 podStartE2EDuration="4.190087658s" podCreationTimestamp="2026-03-18 16:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:25.189816045 +0000 UTC m=+5.719689227" watchObservedRunningTime="2026-03-18 16:44:25.190087658 +0000 UTC m=+5.719960842" Mar 18 16:44:25.615325 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:25.615222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:25.615474 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.615378 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:25.615474 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.615456 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.615437142 +0000 UTC m=+10.145310305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:25.716607 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:25.716572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:25.716802 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.716730 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:25.716802 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.716749 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:25.716802 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.716762 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:25.717041 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:25.716819 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.716801084 +0000 UTC m=+10.246674248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:26.123638 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:26.122858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:26.123638 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:26.123017 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:26.123638 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:26.123456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:26.123638 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:26.123555 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:28.122251 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:28.121711 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:28.122251 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:28.121742 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:28.122251 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:28.121868 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:28.122251 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:28.122000 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:29.652052 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:29.652003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:29.652522 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.652258 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:29.652522 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.652336 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.652316882 +0000 UTC m=+18.182190056 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:29.752473 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:29.752422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:29.752640 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.752577 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:29.752640 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.752599 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:29.752640 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.752610 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:29.752808 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:29.752675 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:37.752655443 +0000 UTC m=+18.282528615 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:30.122581 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:30.122503 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:30.122752 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:30.122629 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:30.122752 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:30.122642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:30.122752 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:30.122732 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:32.121245 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:32.121208 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:32.121687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:32.121251 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:32.121687 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:32.121361 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:32.121687 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:32.121471 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:34.121208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:34.121168 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:34.121208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:34.121202 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:34.121691 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:34.121290 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:34.121691 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:34.121419 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:36.121169 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:36.121132 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:36.121644 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:36.121246 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:36.121644 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:36.121305 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:36.121644 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:36.121435 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:37.706604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:37.706566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:37.706954 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.706682 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:37.706954 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.706739 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.706722979 +0000 UTC m=+34.236596139 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:37.807269 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:37.807225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:37.807416 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.807371 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:37.807416 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.807391 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:37.807416 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.807405 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:37.807548 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:37.807465 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:53.807449537 +0000 UTC m=+34.337322699 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:38.121334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:38.121242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:38.121497 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:38.121265 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:38.121497 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:38.121384 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:38.121620 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:38.121506 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:40.122811 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.122352 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:40.123484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.122431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:40.123484 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:40.122922 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:40.123484 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:40.123051 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:40.202068 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.202035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kv8gs" event={"ID":"6d41cdd0-c3ee-43e6-84d1-051cad027162","Type":"ContainerStarted","Data":"70303f4b94a1ebfaa8337566f39e6fb05fdad719bb66e9d49e10d8d76846d6fd"} Mar 18 16:44:40.203432 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.203403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rwpwz" event={"ID":"8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae","Type":"ContainerStarted","Data":"29bfc0b62e80695dfc2ba595979590527bc6d0c055499cdc66a3bee60143e431"} Mar 18 16:44:40.204889 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.204866 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"5258ec722a35fb88809a9b34418423f91a2053c4ac18a620de68b220dca4b0b9"} Mar 18 16:44:40.204889 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.204896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"295f3bdadd3d15ddfc2a5e236dd623229d03250ef2020e8d0e36d6e81c4f0a5c"} Mar 18 16:44:40.206279 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.206255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" event={"ID":"97af0190-d870-4459-8d1a-d160247ed29f","Type":"ContainerStarted","Data":"169f01317fcdc61e8c849b08116ca8ae02f4e9c5940f377d3476a0f38d8d1919"} Mar 18 16:44:40.207786 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.207759 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" 
containerID="144ec6afc941c0bc402d713c3ef165943cd1bf7be3c93f29b31e2568097826d2" exitCode=0 Mar 18 16:44:40.207869 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.207843 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"144ec6afc941c0bc402d713c3ef165943cd1bf7be3c93f29b31e2568097826d2"} Mar 18 16:44:40.209353 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.209335 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lfgl5" event={"ID":"3fc7c75c-9d88-4d69-a623-1eb256939d93","Type":"ContainerStarted","Data":"c5c2a7f3cbe0fa5bad1ca107ef364d92362acd873693e8c717b3cd6acfa69238"} Mar 18 16:44:40.211317 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.210655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" event={"ID":"ef75d54c-b5e4-43b6-a240-283b489f554e","Type":"ContainerStarted","Data":"385b3f179a175a7b3abdda61fd17c3fc4e3a48baaf8b892331fc77923457e286"} Mar 18 16:44:40.211965 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.211947 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fd4lp" event={"ID":"17730a74-9827-4a73-be22-72ef96f3aeb0","Type":"ContainerStarted","Data":"015d860289af08286ddfb843070626ec4bf45f0c42c1e695fee148c6baed0d12"} Mar 18 16:44:40.218230 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.218196 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kv8gs" podStartSLOduration=3.403207343 podStartE2EDuration="20.21818552s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.756148586 +0000 UTC m=+3.286021748" lastFinishedPulling="2026-03-18 16:44:39.571126766 +0000 UTC m=+20.100999925" observedRunningTime="2026-03-18 16:44:40.217527514 +0000 UTC m=+20.747400695" 
watchObservedRunningTime="2026-03-18 16:44:40.21818552 +0000 UTC m=+20.748058699" Mar 18 16:44:40.236297 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.236259 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lfgl5" podStartSLOduration=3.376598355 podStartE2EDuration="20.23624602s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.747388933 +0000 UTC m=+3.277262096" lastFinishedPulling="2026-03-18 16:44:39.607036591 +0000 UTC m=+20.136909761" observedRunningTime="2026-03-18 16:44:40.235894892 +0000 UTC m=+20.765768072" watchObservedRunningTime="2026-03-18 16:44:40.23624602 +0000 UTC m=+20.766119199" Mar 18 16:44:40.252235 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.252188 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fd4lp" podStartSLOduration=3.785397192 podStartE2EDuration="20.252173928s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.742865358 +0000 UTC m=+3.272738516" lastFinishedPulling="2026-03-18 16:44:39.209642094 +0000 UTC m=+19.739515252" observedRunningTime="2026-03-18 16:44:40.251866159 +0000 UTC m=+20.781739342" watchObservedRunningTime="2026-03-18 16:44:40.252173928 +0000 UTC m=+20.782047107" Mar 18 16:44:40.304200 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:40.304157 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nhkg8" podStartSLOduration=3.480929319 podStartE2EDuration="20.304143566s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.747785246 +0000 UTC m=+3.277658405" lastFinishedPulling="2026-03-18 16:44:39.57099948 +0000 UTC m=+20.100872652" observedRunningTime="2026-03-18 16:44:40.303851255 +0000 UTC m=+20.833724434" watchObservedRunningTime="2026-03-18 16:44:40.304143566 +0000 UTC m=+20.834016745" Mar 18 
16:44:41.207797 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.207753 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:41.208711 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.208690 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:41.215513 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.215485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5nt27" event={"ID":"91a1650a-2ef0-48b3-a7c3-38900240dde0","Type":"ContainerStarted","Data":"e92ef0879513986e9f8058e98c4f41728cd2ab2df1fb50d9834e82e1464b9a8e"} Mar 18 16:44:41.218582 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.218557 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"16b3b0eec32c63a488f774601954991195b4ecfc9f466cdad258fa003c5d35ac"} Mar 18 16:44:41.218669 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.218586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"49d9586253c124ac1b491936165b44b42ed94d5087b24b8e4a6894ba4f0965ce"} Mar 18 16:44:41.218669 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.218600 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"5e2ad764fe37cfb094ffc1c9b8591e85358766b2629c84075f37212f0255a144"} Mar 18 16:44:41.218669 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.218612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" 
event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"466b03415fb5e39f2c76a7bd62f9d6da1392900831c8a3847a90c0411336e083"} Mar 18 16:44:41.219373 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.219087 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:41.219591 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.219576 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kv8gs" Mar 18 16:44:41.236501 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.236460 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rwpwz" podStartSLOduration=4.416775518 podStartE2EDuration="21.236447391s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.751480272 +0000 UTC m=+3.281353447" lastFinishedPulling="2026-03-18 16:44:39.571152154 +0000 UTC m=+20.101025320" observedRunningTime="2026-03-18 16:44:40.32080658 +0000 UTC m=+20.850679778" watchObservedRunningTime="2026-03-18 16:44:41.236447391 +0000 UTC m=+21.766320572" Mar 18 16:44:41.259237 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.259193 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5nt27" podStartSLOduration=4.465396695 podStartE2EDuration="21.259179066s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.752183694 +0000 UTC m=+3.282056852" lastFinishedPulling="2026-03-18 16:44:39.545966056 +0000 UTC m=+20.075839223" observedRunningTime="2026-03-18 16:44:41.259154864 +0000 UTC m=+21.789028060" watchObservedRunningTime="2026-03-18 16:44:41.259179066 +0000 UTC m=+21.789052228" Mar 18 16:44:41.447349 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:41.447275 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired 
state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:42.041984 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.041874 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:41.447296524Z","UUID":"a2083cd5-757f-494f-a8bc-5e1d63d20643","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:42.043952 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.043918 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 16:44:42.044096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.043966 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:42.121086 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.121052 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:42.121258 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:42.121177 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:42.121452 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.121417 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:42.121562 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:42.121547 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:42.223398 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:42.223351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" event={"ID":"97af0190-d870-4459-8d1a-d160247ed29f","Type":"ContainerStarted","Data":"64a83bd8bd4b603d9a26fe2b783990a2276a4e55a1a9458c7f5e19468b4de74c"} Mar 18 16:44:44.122027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:44.121989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:44.122488 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:44.122032 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:44.122488 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:44.122123 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:44.122488 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:44.122236 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:45.232267 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:45.232072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"bacf34c0a49ec768931720041bf24e03192fe9d8fc21f8040157eac56ae9d7da"} Mar 18 16:44:45.233742 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:45.233711 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" event={"ID":"97af0190-d870-4459-8d1a-d160247ed29f","Type":"ContainerStarted","Data":"dd879458d4b51a8a41299afeacf6ca019b1d2af317573c10dbe63a6b09f27408"} Mar 18 16:44:45.236440 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:45.236418 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" containerID="9072a6ddc37362a1a4500416a5394912730e51d37adf778c9b80dc5b66cc60cc" exitCode=0 Mar 18 16:44:45.236536 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:45.236452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"9072a6ddc37362a1a4500416a5394912730e51d37adf778c9b80dc5b66cc60cc"} Mar 18 16:44:45.273570 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:45.273523 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k5jxk" podStartSLOduration=3.877517422 podStartE2EDuration="25.273509546s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.753759416 +0000 UTC m=+3.283632576" lastFinishedPulling="2026-03-18 16:44:44.149751542 +0000 UTC m=+24.679624700" observedRunningTime="2026-03-18 16:44:45.252386185 +0000 UTC m=+25.782259365" watchObservedRunningTime="2026-03-18 16:44:45.273509546 +0000 UTC m=+25.803382727" Mar 18 16:44:46.121659 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:46.121638 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:46.121773 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:46.121639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:46.121823 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:46.121769 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:46.121858 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:46.121825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:46.240080 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:46.240045 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" containerID="be84b0ff5c7426582b9446482e49d771cfc95b4f703ea601fb2e6e288a1d9435" exitCode=0 Mar 18 16:44:46.240561 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:46.240104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"be84b0ff5c7426582b9446482e49d771cfc95b4f703ea601fb2e6e288a1d9435"} Mar 18 16:44:47.244642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.244402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" event={"ID":"ee65363f-46b9-4194-8b09-6d6e1a39303b","Type":"ContainerStarted","Data":"0049cfa3c10fc774103d7efef680d64ee1b2c0e0e1f9e811c5debb4f1ae117c6"} Mar 18 16:44:47.244642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.244641 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:47.246580 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.246556 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" containerID="03b3a7f753f19a64ae52ed36dc3553728d5317bfd15ff1167500960be5f4b2dc" exitCode=0 Mar 18 16:44:47.246680 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.246602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"03b3a7f753f19a64ae52ed36dc3553728d5317bfd15ff1167500960be5f4b2dc"} Mar 18 16:44:47.258852 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.258835 2571 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:47.272348 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:47.272311 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" podStartSLOduration=10.378743589 podStartE2EDuration="27.272300537s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.745432171 +0000 UTC m=+3.275305330" lastFinishedPulling="2026-03-18 16:44:39.63898912 +0000 UTC m=+20.168862278" observedRunningTime="2026-03-18 16:44:47.271198536 +0000 UTC m=+27.801071726" watchObservedRunningTime="2026-03-18 16:44:47.272300537 +0000 UTC m=+27.802173716" Mar 18 16:44:48.121369 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.121340 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:48.121536 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.121348 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:48.121536 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:48.121508 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:48.121638 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:48.121610 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:48.249123 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.249092 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:48.249123 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.249135 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:48.266796 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.266771 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:44:48.277423 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.277388 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jf76n"] Mar 18 16:44:48.277588 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.277558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:48.277820 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:48.277685 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:48.279776 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.279528 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qv92n"] Mar 18 16:44:48.279776 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:48.279653 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:48.279776 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:48.279739 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:50.122507 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:50.122473 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:50.122938 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:50.122584 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:50.122938 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:50.122643 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:50.122938 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:50.122788 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:52.121350 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.121098 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:52.121862 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.121168 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:52.121862 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:52.121447 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qv92n" podUID="3e4d4190-53f5-422e-ba62-0a231b728c8d" Mar 18 16:44:52.121862 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:52.121515 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781" Mar 18 16:44:52.759448 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.759419 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-43.ec2.internal" event="NodeReady" Mar 18 16:44:52.759733 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.759540 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Mar 18 16:44:52.802210 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.802179 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zdwcx"] Mar 18 16:44:52.804193 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.804179 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:52.806270 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.806243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:44:52.806404 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.806308 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:44:52.806404 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.806363 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\"" Mar 18 16:44:52.807439 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.807420 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-98z82"] Mar 18 16:44:52.809466 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.809450 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:52.811460 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.811326 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:44:52.811564 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.811467 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\"" Mar 18 16:44:52.811564 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.811518 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:44:52.812220 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.811628 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:44:52.815063 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.815045 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zdwcx"] Mar 18 16:44:52.826847 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.826825 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-98z82"] Mar 18 16:44:52.912997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.912937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:52.913119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.913052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47hwv\" (UniqueName: 
\"kubernetes.io/projected/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-kube-api-access-47hwv\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:52.913119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.913103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfa6872b-035c-45af-901b-55b2097c2b3d-tmp-dir\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:52.913188 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.913120 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2wwn\" (UniqueName: \"kubernetes.io/projected/bfa6872b-035c-45af-901b-55b2097c2b3d-kube-api-access-g2wwn\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:52.913237 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.913223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:52.913274 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:52.913248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa6872b-035c-45af-901b-55b2097c2b3d-config-volume\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.013996 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.013905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfa6872b-035c-45af-901b-55b2097c2b3d-tmp-dir\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.013996 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.013940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2wwn\" (UniqueName: \"kubernetes.io/projected/bfa6872b-035c-45af-901b-55b2097c2b3d-kube-api-access-g2wwn\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.013996 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.013991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:53.014246 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.014019 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa6872b-035c-45af-901b-55b2097c2b3d-config-volume\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.014246 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.014049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.014246 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.014080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47hwv\" (UniqueName: 
\"kubernetes.io/projected/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-kube-api-access-47hwv\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:53.014246 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.014164 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:53.014246 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.014200 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:53.014508 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.014260 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.514237489 +0000 UTC m=+34.044110665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:44:53.014508 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.014281 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.514271434 +0000 UTC m=+34.044144593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:44:53.014508 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.014335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfa6872b-035c-45af-901b-55b2097c2b3d-tmp-dir\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.015381 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.015363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfa6872b-035c-45af-901b-55b2097c2b3d-config-volume\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.028069 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.028047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47hwv\" (UniqueName: \"kubernetes.io/projected/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-kube-api-access-47hwv\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:53.028164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.028104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2wwn\" (UniqueName: \"kubernetes.io/projected/bfa6872b-035c-45af-901b-55b2097c2b3d-kube-api-access-g2wwn\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.260347 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.260314 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" 
containerID="407dbce3561c00955142f5524a50c31a38c494d2af295ba8e71a10d02bc2bd81" exitCode=0 Mar 18 16:44:53.261006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.260357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"407dbce3561c00955142f5524a50c31a38c494d2af295ba8e71a10d02bc2bd81"} Mar 18 16:44:53.518859 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.518772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:53.518859 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.518814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:53.519061 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.518909 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:53.519061 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.518988 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.518956861 +0000 UTC m=+35.048830019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:44:53.519061 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.519003 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:53.519061 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.519042 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:54.519030578 +0000 UTC m=+35.048903736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:44:53.720196 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.720156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:53.720364 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.720286 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:53.720364 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.720345 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 
nodeName:}" failed. No retries permitted until 2026-03-18 16:45:25.720331588 +0000 UTC m=+66.250204746 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:53.821148 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:53.821076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:53.821292 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.821270 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:53.821352 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.821299 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:53.821352 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.821313 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mxw2q for pod openshift-network-diagnostics/network-check-target-qv92n: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:53.821444 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:53.821379 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q 
podName:3e4d4190-53f5-422e-ba62-0a231b728c8d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:25.821360386 +0000 UTC m=+66.351233558 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxw2q" (UniqueName: "kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q") pod "network-check-target-qv92n" (UID: "3e4d4190-53f5-422e-ba62-0a231b728c8d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:54.121615 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.121534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:44:54.121766 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.121536 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:44:54.124022 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.124002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:54.124247 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.124226 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gnx58\"" Mar 18 16:44:54.124247 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.124243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:54.124400 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.124229 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:54.124400 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.124307 2571 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:44:54.264989 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.264933 2571 generic.go:358] "Generic (PLEG): container finished" podID="d2c29efd-c4cf-40cc-91bb-1c82e76eea41" containerID="448b245da962f8e065739ba8da99a1dd625b67a33329c1a335a51def050cf600" exitCode=0 Mar 18 16:44:54.265349 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.265019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerDied","Data":"448b245da962f8e065739ba8da99a1dd625b67a33329c1a335a51def050cf600"} Mar 18 16:44:54.527027 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.526991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:54.527173 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:54.527037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:54.527173 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:54.527078 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:54.527173 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:54.527144 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:56.527127561 +0000 UTC m=+37.057000722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:44:54.527173 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:54.527156 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:54.527320 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:54.527191 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:44:56.52718132 +0000 UTC m=+37.057054479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:44:55.270389 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:55.270360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" event={"ID":"d2c29efd-c4cf-40cc-91bb-1c82e76eea41","Type":"ContainerStarted","Data":"cda9201aca152274238ee4e2ccc8ecad909cf8037c49fc5e4af1fe831e1cf9cf"} Mar 18 16:44:55.296513 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:55.296464 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fpm5p" podStartSLOduration=5.20561972 podStartE2EDuration="35.296451251s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:44:22.749326571 +0000 UTC m=+3.279199733" lastFinishedPulling="2026-03-18 
16:44:52.840158102 +0000 UTC m=+33.370031264" observedRunningTime="2026-03-18 16:44:55.295265501 +0000 UTC m=+35.825138681" watchObservedRunningTime="2026-03-18 16:44:55.296451251 +0000 UTC m=+35.826324431" Mar 18 16:44:56.545560 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:56.545519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:44:56.546024 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:44:56.545572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:44:56.546024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:56.545681 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:56.546024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:56.545762 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:45:00.545744326 +0000 UTC m=+41.075617485 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:44:56.546024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:56.545687 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:56.546024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:44:56.545914 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:00.545897238 +0000 UTC m=+41.075770398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:45:00.576165 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:00.576128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:45:00.576165 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:00.576169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:45:00.576567 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:00.576276 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:00.576567 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:00.576343 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:45:08.576327482 +0000 UTC m=+49.106200640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:45:00.576567 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:00.576288 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:00.576567 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:00.576376 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:08.576369815 +0000 UTC m=+49.106242973 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:45:08.635500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:08.635462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:45:08.636024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:08.635626 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:08.636024 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:08.635633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:45:08.636024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:08.635692 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:08.636024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:08.635697 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:24.635676387 +0000 UTC m=+65.165549566 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:45:08.636024 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:08.635754 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:45:24.635743436 +0000 UTC m=+65.165616600 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:45:20.263594 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:20.263564 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m9kqz" Mar 18 16:45:24.642744 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:24.642706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:45:24.642744 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:24.642752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:45:24.643169 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:24.642864 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:24.643169 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:24.642905 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:24.643169 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:24.642941 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:45:56.642921768 +0000 UTC m=+97.172794927 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:45:24.643169 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:24.642966 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:56.642955667 +0000 UTC m=+97.172828831 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:45:25.749207 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.749164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:45:25.751387 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.751371 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:45:25.760408 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:25.760380 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:25.760475 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:25.760451 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:29.76043189 +0000 UTC m=+130.290305053 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : secret "metrics-daemon-secret" not found Mar 18 16:45:25.850403 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.850370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:45:25.852732 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.852711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:45:25.863220 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.863202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:45:25.874078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.874058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxw2q\" (UniqueName: \"kubernetes.io/projected/3e4d4190-53f5-422e-ba62-0a231b728c8d-kube-api-access-mxw2q\") pod \"network-check-target-qv92n\" (UID: \"3e4d4190-53f5-422e-ba62-0a231b728c8d\") " pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:45:25.934438 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.934409 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gnx58\"" Mar 18 16:45:25.943038 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:25.943013 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:45:26.099428 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:26.099395 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qv92n"] Mar 18 16:45:26.104589 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:45:26.104558 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e4d4190_53f5_422e_ba62_0a231b728c8d.slice/crio-2d5ce3c7b1fca8e8260cde466f444b311c6efc40a0a9d9bd19b9ff8d1be65859 WatchSource:0}: Error finding container 2d5ce3c7b1fca8e8260cde466f444b311c6efc40a0a9d9bd19b9ff8d1be65859: Status 404 returned error can't find the container with id 2d5ce3c7b1fca8e8260cde466f444b311c6efc40a0a9d9bd19b9ff8d1be65859 Mar 18 16:45:26.329266 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:26.329185 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qv92n" event={"ID":"3e4d4190-53f5-422e-ba62-0a231b728c8d","Type":"ContainerStarted","Data":"2d5ce3c7b1fca8e8260cde466f444b311c6efc40a0a9d9bd19b9ff8d1be65859"} Mar 18 16:45:29.336090 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:29.336046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qv92n" event={"ID":"3e4d4190-53f5-422e-ba62-0a231b728c8d","Type":"ContainerStarted","Data":"a05980e9ec026e6ec69a90a4b46a1c22295e8fbc5d144cab6c18e31c56ccfa0c"} Mar 18 16:45:29.336525 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:29.336269 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:45:29.351189 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:29.351071 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qv92n" 
podStartSLOduration=66.666890637 podStartE2EDuration="1m9.351054062s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:45:26.108321753 +0000 UTC m=+66.638194925" lastFinishedPulling="2026-03-18 16:45:28.792485191 +0000 UTC m=+69.322358350" observedRunningTime="2026-03-18 16:45:29.350814161 +0000 UTC m=+69.880687340" watchObservedRunningTime="2026-03-18 16:45:29.351054062 +0000 UTC m=+69.880927239" Mar 18 16:45:56.658325 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:56.658270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82" Mar 18 16:45:56.658325 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:45:56.658333 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx" Mar 18 16:45:56.658871 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:56.658430 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:56.658871 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:56.658499 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert podName:54e235df-2ad8-4fbe-81bc-dfb66eafbf2b nodeName:}" failed. No retries permitted until 2026-03-18 16:47:00.658482336 +0000 UTC m=+161.188355499 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert") pod "ingress-canary-98z82" (UID: "54e235df-2ad8-4fbe-81bc-dfb66eafbf2b") : secret "canary-serving-cert" not found Mar 18 16:45:56.658871 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:56.658433 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:56.658871 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:45:56.658572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls podName:bfa6872b-035c-45af-901b-55b2097c2b3d nodeName:}" failed. No retries permitted until 2026-03-18 16:47:00.658559425 +0000 UTC m=+161.188432583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls") pod "dns-default-zdwcx" (UID: "bfa6872b-035c-45af-901b-55b2097c2b3d") : secret "dns-default-metrics-tls" not found Mar 18 16:46:00.340265 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:00.340233 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qv92n" Mar 18 16:46:29.785496 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:29.785441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:46:29.786021 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:29.785591 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:46:29.786021 ip-10-0-139-43 kubenswrapper[2571]: E0318 
16:46:29.785669 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs podName:bf79a021-e091-4ae2-bd19-2bd1205de781 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:31.785650751 +0000 UTC m=+252.315523910 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs") pod "network-metrics-daemon-jf76n" (UID: "bf79a021-e091-4ae2-bd19-2bd1205de781") : secret "metrics-daemon-secret" not found
Mar 18 16:46:32.760306 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.760274 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"]
Mar 18 16:46:32.763041 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.763011 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:32.763288 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.763264 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6785879c4-rgndg"]
Mar 18 16:46:32.766250 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.766217 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.767358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.767315 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 18 16:46:32.767358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.767338 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:32.767358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.767339 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:32.767552 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.767451 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-nwvw9\""
Mar 18 16:46:32.768367 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.768353 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Mar 18 16:46:32.768458 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.768441 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Mar 18 16:46:32.769077 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.769061 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Mar 18 16:46:32.769870 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.769857 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Mar 18 16:46:32.770039 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.770024 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 18 16:46:32.770113 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.770055 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 18 16:46:32.770113 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.770072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rqj44\""
Mar 18 16:46:32.798620 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.798590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"]
Mar 18 16:46:32.799240 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.799223 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6785879c4-rgndg"]
Mar 18 16:46:32.874622 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.874581 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"]
Mar 18 16:46:32.876874 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.876852 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"]
Mar 18 16:46:32.877045 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.877024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:32.878835 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.878814 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"]
Mar 18 16:46:32.878957 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.878942 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:32.880570 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.880550 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:46:32.880673 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.880577 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:46:32.880813 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.880799 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Mar 18 16:46:32.881024 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.880828 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Mar 18 16:46:32.881024 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.880914 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:32.882376 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882154 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:32.882376 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882216 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Mar 18 16:46:32.882376 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bt59t\""
Mar 18 16:46:32.882642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-4fvmk\""
Mar 18 16:46:32.882642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882520 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:32.882745 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.882704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Mar 18 16:46:32.884408 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.884387 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ptbml\""
Mar 18 16:46:32.884511 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.884429 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Mar 18 16:46:32.884511 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.884441 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:32.884511 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.884463 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:32.884511 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.884446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Mar 18 16:46:32.893298 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.893272 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"]
Mar 18 16:46:32.895301 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.895283 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"]
Mar 18 16:46:32.904452 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.904426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzbt\" (UniqueName: \"kubernetes.io/projected/d4b49dc5-e716-48c1-afae-5d4eeba82433-kube-api-access-ndzbt\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:32.904638 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.904616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-default-certificate\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.904777 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.904759 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-stats-auth\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.905005 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.904941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/09d9a175-36dd-4900-92c2-7b9a0986e68d-kube-api-access-vr2xp\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.905193 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.905171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.905299 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.905221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:32.905367 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.905313 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:32.906693 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:32.906672 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"]
Mar 18 16:46:33.006266 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b1e3b5-f3de-4e5a-855d-d72d883b476f-config\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.006266 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e8b58c-5235-4208-9c7e-f4ff8c45861a-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndzbt\" (UniqueName: \"kubernetes.io/projected/d4b49dc5-e716-48c1-afae-5d4eeba82433-kube-api-access-ndzbt\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-default-certificate\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-stats-auth\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/09d9a175-36dd-4900-92c2-7b9a0986e68d-kube-api-access-vr2xp\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j2wp\" (UniqueName: \"kubernetes.io/projected/f0e93606-444c-4b6a-8294-6603a2b534e8-kube-api-access-4j2wp\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e8b58c-5235-4208-9c7e-f4ff8c45861a-config\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006466 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f0e93606-444c-4b6a-8294-6603a2b534e8-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.006505 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxdc8\" (UniqueName: \"kubernetes.io/projected/71b1e3b5-f3de-4e5a-855d-d72d883b476f-kube-api-access-rxdc8\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.006665 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:33.506614766 +0000 UTC m=+134.036487947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.006681 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.006723 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls podName:d4b49dc5-e716-48c1-afae-5d4eeba82433 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:33.506712564 +0000 UTC m=+134.036585725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-n88ss" (UID: "d4b49dc5-e716-48c1-afae-5d4eeba82433") : secret "samples-operator-tls" not found
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.006820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b1e3b5-f3de-4e5a-855d-d72d883b476f-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.007110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.007110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.006897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqtq\" (UniqueName: \"kubernetes.io/projected/92e8b58c-5235-4208-9c7e-f4ff8c45861a-kube-api-access-kmqtq\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.007110 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.006925 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:46:33.007110 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.007076 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:33.507061205 +0000 UTC m=+134.036934365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : secret "router-metrics-certs-default" not found
Mar 18 16:46:33.008837 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.008812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-stats-auth\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.008962 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.008942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-default-certificate\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.014786 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.014727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndzbt\" (UniqueName: \"kubernetes.io/projected/d4b49dc5-e716-48c1-afae-5d4eeba82433-kube-api-access-ndzbt\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:33.015065 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.015050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2xp\" (UniqueName: \"kubernetes.io/projected/09d9a175-36dd-4900-92c2-7b9a0986e68d-kube-api-access-vr2xp\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.107404 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b1e3b5-f3de-4e5a-855d-d72d883b476f-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqtq\" (UniqueName: \"kubernetes.io/projected/92e8b58c-5235-4208-9c7e-f4ff8c45861a-kube-api-access-kmqtq\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b1e3b5-f3de-4e5a-855d-d72d883b476f-config\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e8b58c-5235-4208-9c7e-f4ff8c45861a-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j2wp\" (UniqueName: \"kubernetes.io/projected/f0e93606-444c-4b6a-8294-6603a2b534e8-kube-api-access-4j2wp\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e8b58c-5235-4208-9c7e-f4ff8c45861a-config\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.107598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f0e93606-444c-4b6a-8294-6603a2b534e8-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.107904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxdc8\" (UniqueName: \"kubernetes.io/projected/71b1e3b5-f3de-4e5a-855d-d72d883b476f-kube-api-access-rxdc8\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.107904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.107718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.107904 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.107821 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:33.107904 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.107885 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:33.607864029 +0000 UTC m=+134.137737193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:33.108166 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.108144 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b1e3b5-f3de-4e5a-855d-d72d883b476f-config\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.108390 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.108367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f0e93606-444c-4b6a-8294-6603a2b534e8-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.108924 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.108900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e8b58c-5235-4208-9c7e-f4ff8c45861a-config\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.109878 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.109851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e8b58c-5235-4208-9c7e-f4ff8c45861a-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.109992 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.109958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b1e3b5-f3de-4e5a-855d-d72d883b476f-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.117555 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.117532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxdc8\" (UniqueName: \"kubernetes.io/projected/71b1e3b5-f3de-4e5a-855d-d72d883b476f-kube-api-access-rxdc8\") pod \"kube-storage-version-migrator-operator-866f46547-mr5cf\" (UID: \"71b1e3b5-f3de-4e5a-855d-d72d883b476f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.117639 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.117621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqtq\" (UniqueName: \"kubernetes.io/projected/92e8b58c-5235-4208-9c7e-f4ff8c45861a-kube-api-access-kmqtq\") pod \"service-ca-operator-56f6f4cbcb-lw72g\" (UID: \"92e8b58c-5235-4208-9c7e-f4ff8c45861a\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.117683 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.117647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j2wp\" (UniqueName: \"kubernetes.io/projected/f0e93606-444c-4b6a-8294-6603a2b534e8-kube-api-access-4j2wp\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.193151 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.193117 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"
Mar 18 16:46:33.198836 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.198812 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"
Mar 18 16:46:33.312153 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.312111 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf"]
Mar 18 16:46:33.316269 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:46:33.316243 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71b1e3b5_f3de_4e5a_855d_d72d883b476f.slice/crio-fc5a224541e2a2356e35b90d5643350725648aabc4c4bd793beaf2cc76d52cad WatchSource:0}: Error finding container fc5a224541e2a2356e35b90d5643350725648aabc4c4bd793beaf2cc76d52cad: Status 404 returned error can't find the container with id fc5a224541e2a2356e35b90d5643350725648aabc4c4bd793beaf2cc76d52cad
Mar 18 16:46:33.325452 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.325427 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g"]
Mar 18 16:46:33.328169 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:46:33.328144 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92e8b58c_5235_4208_9c7e_f4ff8c45861a.slice/crio-9d3894effaf3b352eac0471ac9961b8b480873c097c35efef14a577a9348472c WatchSource:0}: Error finding container 9d3894effaf3b352eac0471ac9961b8b480873c097c35efef14a577a9348472c: Status 404 returned error can't find the container with id 9d3894effaf3b352eac0471ac9961b8b480873c097c35efef14a577a9348472c
Mar 18 16:46:33.452639 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.452604 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g" event={"ID":"92e8b58c-5235-4208-9c7e-f4ff8c45861a","Type":"ContainerStarted","Data":"9d3894effaf3b352eac0471ac9961b8b480873c097c35efef14a577a9348472c"}
Mar 18 16:46:33.453472 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.453450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf" event={"ID":"71b1e3b5-f3de-4e5a-855d-d72d883b476f","Type":"ContainerStarted","Data":"fc5a224541e2a2356e35b90d5643350725648aabc4c4bd793beaf2cc76d52cad"}
Mar 18 16:46:33.511196 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.511164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.511350 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.511207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"
Mar 18 16:46:33.511350 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.511256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:46:33.511350 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.511330 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:34.511311496 +0000 UTC m=+135.041184674 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt
Mar 18 16:46:33.511456 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.511354 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 18 16:46:33.511456 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.511375 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:33.511456 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.511393 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:34.511381492 +0000 UTC m=+135.041254650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : secret "router-metrics-certs-default" not found
Mar 18 16:46:33.511456 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.511430 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls podName:d4b49dc5-e716-48c1-afae-5d4eeba82433 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:34.511418623 +0000 UTC m=+135.041291783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-n88ss" (UID: "d4b49dc5-e716-48c1-afae-5d4eeba82433") : secret "samples-operator-tls" not found
Mar 18 16:46:33.612185 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:33.612081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:46:33.612341 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.612222 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:33.612341 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:33.612290 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed.
No retries permitted until 2026-03-18 16:46:34.612271782 +0000 UTC m=+135.142144941 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:34.519379 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:34.519334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:34.519438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:34.519479 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.519491 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:34.519807 
ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.519560 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.519545008 +0000 UTC m=+137.049418171 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : secret "router-metrics-certs-default" not found Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.519614 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.519619 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.519600367 +0000 UTC m=+137.049473528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:34.519807 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.519661 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls podName:d4b49dc5-e716-48c1-afae-5d4eeba82433 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.519646933 +0000 UTC m=+137.049520093 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-n88ss" (UID: "d4b49dc5-e716-48c1-afae-5d4eeba82433") : secret "samples-operator-tls" not found Mar 18 16:46:34.620500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:34.620456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" Mar 18 16:46:34.620685 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.620579 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:34.620685 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:34.620646 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:36.620627628 +0000 UTC m=+137.150500791 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.461105 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.461065 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g" event={"ID":"92e8b58c-5235-4208-9c7e-f4ff8c45861a","Type":"ContainerStarted","Data":"d5d2aa21d30b03a93a7fef2749c1a7e8bc89c76c912d8e2b1f3051e437d9ebc1"} Mar 18 16:46:36.462361 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.462334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf" event={"ID":"71b1e3b5-f3de-4e5a-855d-d72d883b476f","Type":"ContainerStarted","Data":"32c79ecb18554e4237832b264fef3940131a693072f06a3409385ceae0f705a2"} Mar 18 16:46:36.477079 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.477032 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g" podStartSLOduration=2.137098305 podStartE2EDuration="4.477017485s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:46:33.329769729 +0000 UTC m=+133.859642889" lastFinishedPulling="2026-03-18 16:46:35.669688911 +0000 UTC m=+136.199562069" observedRunningTime="2026-03-18 16:46:36.476327623 +0000 UTC m=+137.006200805" watchObservedRunningTime="2026-03-18 16:46:36.477017485 +0000 UTC m=+137.006890666" Mar 18 16:46:36.491299 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.491243 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf" podStartSLOduration=2.140431038 podStartE2EDuration="4.491225823s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:46:33.319837113 +0000 UTC m=+133.849710273" lastFinishedPulling="2026-03-18 16:46:35.670631895 +0000 UTC m=+136.200505058" observedRunningTime="2026-03-18 16:46:36.490735772 +0000 UTC m=+137.020608953" watchObservedRunningTime="2026-03-18 16:46:36.491225823 +0000 UTC m=+137.021099009" Mar 18 16:46:36.536427 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.536397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:36.536624 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.536546 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:36.536624 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.536563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:36.536624 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.536614 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls podName:d4b49dc5-e716-48c1-afae-5d4eeba82433 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:46:40.536592801 +0000 UTC m=+141.066465963 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-n88ss" (UID: "d4b49dc5-e716-48c1-afae-5d4eeba82433") : secret "samples-operator-tls" not found Mar 18 16:46:36.536781 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.536701 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:36.536781 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.536760 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:40.536741643 +0000 UTC m=+141.066614805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : secret "router-metrics-certs-default" not found Mar 18 16:46:36.536882 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.536803 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:36.537232 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.536986 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:40.53695573 +0000 UTC m=+141.066828895 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:36.638174 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:36.638115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" Mar 18 16:46:36.638342 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.638258 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:36.638342 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:36.638329 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:40.638309748 +0000 UTC m=+141.168182907 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:40.018933 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:40.018907 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fd4lp_17730a74-9827-4a73-be22-72ef96f3aeb0/dns-node-resolver/0.log" Mar 18 16:46:40.569100 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:40.569050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:40.569100 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:40.569108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:40.569329 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:40.569167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:40.569329 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.569219 2571 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:48.569199737 +0000 UTC m=+149.099072896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:40.569329 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.569282 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 18 16:46:40.569437 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.569332 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:48.569320026 +0000 UTC m=+149.099193185 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : secret "router-metrics-certs-default" not found Mar 18 16:46:40.569437 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.569282 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:40.569437 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.569362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls podName:d4b49dc5-e716-48c1-afae-5d4eeba82433 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:46:48.569354754 +0000 UTC m=+149.099227913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-n88ss" (UID: "d4b49dc5-e716-48c1-afae-5d4eeba82433") : secret "samples-operator-tls" not found Mar 18 16:46:40.670578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:40.670542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" Mar 18 16:46:40.670737 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.670661 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:40.670737 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:40.670716 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:48.670699962 +0000 UTC m=+149.200573122 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:41.419175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:41.419149 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rwpwz_8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae/node-ca/0.log" Mar 18 16:46:42.810056 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:42.810029 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-mr5cf_71b1e3b5-f3de-4e5a-855d-d72d883b476f/kube-storage-version-migrator-operator/0.log" Mar 18 16:46:48.632157 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.632116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:48.632624 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.632167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:48.632624 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:48.632305 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle podName:09d9a175-36dd-4900-92c2-7b9a0986e68d nodeName:}" failed. No retries permitted until 2026-03-18 16:47:04.632281932 +0000 UTC m=+165.162155097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle") pod "router-default-6785879c4-rgndg" (UID: "09d9a175-36dd-4900-92c2-7b9a0986e68d") : configmap references non-existent config key: service-ca.crt Mar 18 16:46:48.632624 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.632390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:48.634675 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.634651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4b49dc5-e716-48c1-afae-5d4eeba82433-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-n88ss\" (UID: \"d4b49dc5-e716-48c1-afae-5d4eeba82433\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:48.634779 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.634749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/09d9a175-36dd-4900-92c2-7b9a0986e68d-metrics-certs\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg" Mar 18 16:46:48.675888 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.675854 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" Mar 18 16:46:48.733201 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.733166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" Mar 18 16:46:48.733364 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:48.733308 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:48.733432 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:48.733377 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls podName:f0e93606-444c-4b6a-8294-6603a2b534e8 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:04.733356393 +0000 UTC m=+165.263229572 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-2t9zt" (UID: "f0e93606-444c-4b6a-8294-6603a2b534e8") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:48.803897 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:48.803862 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss"] Mar 18 16:46:49.488848 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:49.488808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" event={"ID":"d4b49dc5-e716-48c1-afae-5d4eeba82433","Type":"ContainerStarted","Data":"55b3aaf5985d2faf43f5256281107aeb12dcd58cdcebf1877e135ba24fd0ac81"} Mar 18 16:46:50.492425 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:50.492389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" event={"ID":"d4b49dc5-e716-48c1-afae-5d4eeba82433","Type":"ContainerStarted","Data":"50c739055c5394ec9c38ef0b8bf8086e9a45359de9c32ea4e8fd77a1c8d3c29d"} Mar 18 16:46:50.492425 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:50.492426 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" event={"ID":"d4b49dc5-e716-48c1-afae-5d4eeba82433","Type":"ContainerStarted","Data":"c6a40cf44507621e55b1579f9da445f843e586d0eef1b31fc93c7f1302708a01"} Mar 18 16:46:50.509378 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:50.509328 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-n88ss" podStartSLOduration=17.015886802 podStartE2EDuration="18.509314764s" podCreationTimestamp="2026-03-18 
16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:46:48.847702516 +0000 UTC m=+149.377575674" lastFinishedPulling="2026-03-18 16:46:50.341130477 +0000 UTC m=+150.871003636" observedRunningTime="2026-03-18 16:46:50.508842668 +0000 UTC m=+151.038715849" watchObservedRunningTime="2026-03-18 16:46:50.509314764 +0000 UTC m=+151.039187945"
Mar 18 16:46:55.815455 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:55.815414 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zdwcx" podUID="bfa6872b-035c-45af-901b-55b2097c2b3d"
Mar 18 16:46:55.820573 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:55.820536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-98z82" podUID="54e235df-2ad8-4fbe-81bc-dfb66eafbf2b"
Mar 18 16:46:56.504447 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:56.504419 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-98z82"
Mar 18 16:46:56.504679 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:46:56.504481 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zdwcx"
Mar 18 16:46:57.135620 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:46:57.135581 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jf76n" podUID="bf79a021-e091-4ae2-bd19-2bd1205de781"
Mar 18 16:47:00.728023 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:00.727902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82"
Mar 18 16:47:00.728023 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:00.727992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx"
Mar 18 16:47:00.730513 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:00.730480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54e235df-2ad8-4fbe-81bc-dfb66eafbf2b-cert\") pod \"ingress-canary-98z82\" (UID: \"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b\") " pod="openshift-ingress-canary/ingress-canary-98z82"
Mar 18 16:47:00.730922 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:00.730905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfa6872b-035c-45af-901b-55b2097c2b3d-metrics-tls\") pod \"dns-default-zdwcx\" (UID: \"bfa6872b-035c-45af-901b-55b2097c2b3d\") " pod="openshift-dns/dns-default-zdwcx"
Mar 18 16:47:01.007817 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.007735 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\""
Mar 18 16:47:01.008341 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.008321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\""
Mar 18 16:47:01.015925 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.015902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-98z82"
Mar 18 16:47:01.016060 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.015924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zdwcx"
Mar 18 16:47:01.149071 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.149037 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zdwcx"]
Mar 18 16:47:01.153814 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:01.153784 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa6872b_035c_45af_901b_55b2097c2b3d.slice/crio-71ec6eb1ab160a495f3404d60a51967b7a2b08a75f1c231146db51b06f12d4c7 WatchSource:0}: Error finding container 71ec6eb1ab160a495f3404d60a51967b7a2b08a75f1c231146db51b06f12d4c7: Status 404 returned error can't find the container with id 71ec6eb1ab160a495f3404d60a51967b7a2b08a75f1c231146db51b06f12d4c7
Mar 18 16:47:01.163266 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.163242 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-98z82"]
Mar 18 16:47:01.166628 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:01.166603 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e235df_2ad8_4fbe_81bc_dfb66eafbf2b.slice/crio-b1c6b8f49d19c6fd66e92285ca5213d17bef96a85e42439e6e5e265491eb78af WatchSource:0}: Error finding container b1c6b8f49d19c6fd66e92285ca5213d17bef96a85e42439e6e5e265491eb78af: Status 404 returned error can't find the container with id b1c6b8f49d19c6fd66e92285ca5213d17bef96a85e42439e6e5e265491eb78af
Mar 18 16:47:01.260788 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.260712 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-k622p"]
Mar 18 16:47:01.264191 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.264160 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.266198 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.266177 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:47:01.270265 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.266684 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jqp6n\""
Mar 18 16:47:01.270265 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.266858 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:47:01.270265 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.267743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:47:01.270265 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.268224 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:47:01.281002 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.280957 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-k622p"]
Mar 18 16:47:01.433961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.433925 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7sh\" (UniqueName: \"kubernetes.io/projected/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-api-access-hr7sh\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.434169 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.434000 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-data-volume\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.434169 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.434087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.434169 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.434105 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.434169 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.434126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-crio-socket\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.515515 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.515432 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-98z82" event={"ID":"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b","Type":"ContainerStarted","Data":"b1c6b8f49d19c6fd66e92285ca5213d17bef96a85e42439e6e5e265491eb78af"}
Mar 18 16:47:01.516535 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.516506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zdwcx" event={"ID":"bfa6872b-035c-45af-901b-55b2097c2b3d","Type":"ContainerStarted","Data":"71ec6eb1ab160a495f3404d60a51967b7a2b08a75f1c231146db51b06f12d4c7"}
Mar 18 16:47:01.534440 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.534440 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.534681 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-crio-socket\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.534681 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7sh\" (UniqueName: \"kubernetes.io/projected/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-api-access-hr7sh\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.534681 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-data-volume\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.534681 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-crio-socket\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.535010 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.534967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-data-volume\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.535102 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.535067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.537832 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.537811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.549577 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.549546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7sh\" (UniqueName: \"kubernetes.io/projected/517e2087-dc4a-4eb9-a827-7a1760e0d5a2-kube-api-access-hr7sh\") pod \"insights-runtime-extractor-k622p\" (UID: \"517e2087-dc4a-4eb9-a827-7a1760e0d5a2\") " pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.575153 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.575115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-k622p"
Mar 18 16:47:01.714758 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:01.714727 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-k622p"]
Mar 18 16:47:01.720223 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:01.720191 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517e2087_dc4a_4eb9_a827_7a1760e0d5a2.slice/crio-c2cf2a715209e140257fa8fcfdc190718457ef758a02bbe8cf96f66093c1dbd6 WatchSource:0}: Error finding container c2cf2a715209e140257fa8fcfdc190718457ef758a02bbe8cf96f66093c1dbd6: Status 404 returned error can't find the container with id c2cf2a715209e140257fa8fcfdc190718457ef758a02bbe8cf96f66093c1dbd6
Mar 18 16:47:02.522407 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:02.522376 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k622p" event={"ID":"517e2087-dc4a-4eb9-a827-7a1760e0d5a2","Type":"ContainerStarted","Data":"654129caf469f62b3f83e147e8fbcffc6332b8fe738f6a6917bff1423d5ecca7"}
Mar 18 16:47:02.522751 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:02.522416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k622p" event={"ID":"517e2087-dc4a-4eb9-a827-7a1760e0d5a2","Type":"ContainerStarted","Data":"c2cf2a715209e140257fa8fcfdc190718457ef758a02bbe8cf96f66093c1dbd6"}
Mar 18 16:47:03.527061 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.527023 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k622p" event={"ID":"517e2087-dc4a-4eb9-a827-7a1760e0d5a2","Type":"ContainerStarted","Data":"01494f303331dc42bbcc7f938a4b248caa5e992c92880f48786c51291344a890"}
Mar 18 16:47:03.528446 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.528416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-98z82" event={"ID":"54e235df-2ad8-4fbe-81bc-dfb66eafbf2b","Type":"ContainerStarted","Data":"5fdf94dcdbeb1088a94642cf04df9e38b4f28b82e12bc0b71e123917c15f76fd"}
Mar 18 16:47:03.530199 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.530175 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zdwcx" event={"ID":"bfa6872b-035c-45af-901b-55b2097c2b3d","Type":"ContainerStarted","Data":"3b91433f2afc807a86489bb638a720da909eedeadcba689264070d266997b6ef"}
Mar 18 16:47:03.530315 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.530222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zdwcx" event={"ID":"bfa6872b-035c-45af-901b-55b2097c2b3d","Type":"ContainerStarted","Data":"6fab42373b090679e0355c127a389501f95c01e8c2e5a4929b14ee6cf4f4cffd"}
Mar 18 16:47:03.530391 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.530337 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zdwcx"
Mar 18 16:47:03.543745 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.543697 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-98z82" podStartSLOduration=129.586532792 podStartE2EDuration="2m11.543679231s" podCreationTimestamp="2026-03-18 16:44:52 +0000 UTC" firstStartedPulling="2026-03-18 16:47:01.168338798 +0000 UTC m=+161.698211958" lastFinishedPulling="2026-03-18 16:47:03.125485224 +0000 UTC m=+163.655358397" observedRunningTime="2026-03-18 16:47:03.543467742 +0000 UTC m=+164.073340924" watchObservedRunningTime="2026-03-18 16:47:03.543679231 +0000 UTC m=+164.073552413"
Mar 18 16:47:03.559626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:03.559333 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zdwcx" podStartSLOduration=130.283137042 podStartE2EDuration="2m11.559315812s" podCreationTimestamp="2026-03-18 16:44:52 +0000 UTC" firstStartedPulling="2026-03-18 16:47:01.156845809 +0000 UTC m=+161.686718972" lastFinishedPulling="2026-03-18 16:47:02.433024581 +0000 UTC m=+162.962897742" observedRunningTime="2026-03-18 16:47:03.558895594 +0000 UTC m=+164.088768789" watchObservedRunningTime="2026-03-18 16:47:03.559315812 +0000 UTC m=+164.089188994"
Mar 18 16:47:04.534582 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.534541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-k622p" event={"ID":"517e2087-dc4a-4eb9-a827-7a1760e0d5a2","Type":"ContainerStarted","Data":"ba846449e97b9b917b175ef00b3697688442df362c177c958a9904b6d90513d3"}
Mar 18 16:47:04.551520 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.551475 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-k622p" podStartSLOduration=0.973741297 podStartE2EDuration="3.551458918s" podCreationTimestamp="2026-03-18 16:47:01 +0000 UTC" firstStartedPulling="2026-03-18 16:47:01.801174531 +0000 UTC m=+162.331047697" lastFinishedPulling="2026-03-18 16:47:04.378892155 +0000 UTC m=+164.908765318" observedRunningTime="2026-03-18 16:47:04.550610985 +0000 UTC m=+165.080484176" watchObservedRunningTime="2026-03-18 16:47:04.551458918 +0000 UTC m=+165.081332098"
Mar 18 16:47:04.661234 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.661193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:04.662168 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.662145 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09d9a175-36dd-4900-92c2-7b9a0986e68d-service-ca-bundle\") pod \"router-default-6785879c4-rgndg\" (UID: \"09d9a175-36dd-4900-92c2-7b9a0986e68d\") " pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:04.762551 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.762516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:47:04.764850 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.764831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0e93606-444c-4b6a-8294-6603a2b534e8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-2t9zt\" (UID: \"f0e93606-444c-4b6a-8294-6603a2b534e8\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:47:04.880945 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.880897 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:04.986938 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:04.986911 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"
Mar 18 16:47:05.001160 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.001134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6785879c4-rgndg"]
Mar 18 16:47:05.003697 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:05.003672 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d9a175_36dd_4900_92c2_7b9a0986e68d.slice/crio-0650222ecf9e63ced47510f8e13944b30811327db34bdcdc2111340169ab8ad1 WatchSource:0}: Error finding container 0650222ecf9e63ced47510f8e13944b30811327db34bdcdc2111340169ab8ad1: Status 404 returned error can't find the container with id 0650222ecf9e63ced47510f8e13944b30811327db34bdcdc2111340169ab8ad1
Mar 18 16:47:05.106585 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.106534 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt"]
Mar 18 16:47:05.111051 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:05.111020 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e93606_444c_4b6a_8294_6603a2b534e8.slice/crio-44608fa10965c25596671a5ba724a006158072068b8ff12e5413435781726bfe WatchSource:0}: Error finding container 44608fa10965c25596671a5ba724a006158072068b8ff12e5413435781726bfe: Status 404 returned error can't find the container with id 44608fa10965c25596671a5ba724a006158072068b8ff12e5413435781726bfe
Mar 18 16:47:05.538491 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.538453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" event={"ID":"f0e93606-444c-4b6a-8294-6603a2b534e8","Type":"ContainerStarted","Data":"44608fa10965c25596671a5ba724a006158072068b8ff12e5413435781726bfe"}
Mar 18 16:47:05.539717 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.539686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6785879c4-rgndg" event={"ID":"09d9a175-36dd-4900-92c2-7b9a0986e68d","Type":"ContainerStarted","Data":"deee3866bd113ed90058daa0fbf6f2789147ef60f9ba369c3b8b9c8fad11c3b7"}
Mar 18 16:47:05.539717 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.539720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6785879c4-rgndg" event={"ID":"09d9a175-36dd-4900-92c2-7b9a0986e68d","Type":"ContainerStarted","Data":"0650222ecf9e63ced47510f8e13944b30811327db34bdcdc2111340169ab8ad1"}
Mar 18 16:47:05.558009 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.557946 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6785879c4-rgndg" podStartSLOduration=33.557931494 podStartE2EDuration="33.557931494s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:05.556440724 +0000 UTC m=+166.086313905" watchObservedRunningTime="2026-03-18 16:47:05.557931494 +0000 UTC m=+166.087804674"
Mar 18 16:47:05.881829 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.881729 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:05.884673 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:05.884648 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:06.543361 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:06.543321 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:06.544785 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:06.544755 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6785879c4-rgndg"
Mar 18 16:47:07.329076 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.329042 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"]
Mar 18 16:47:07.331731 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.331713 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:07.333649 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.333625 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Mar 18 16:47:07.333758 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.333694 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-v7nrr\""
Mar 18 16:47:07.339452 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.339425 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"]
Mar 18 16:47:07.384083 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.384037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-vqvsg\" (UID: \"14be01bd-4943-46df-a51e-deeac401e9e0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:07.484423 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.484382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-vqvsg\" (UID: \"14be01bd-4943-46df-a51e-deeac401e9e0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:07.484580 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:07.484502 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Mar 18 16:47:07.484580 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:07.484566 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates podName:14be01bd-4943-46df-a51e-deeac401e9e0 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:07.984549959 +0000 UTC m=+168.514423118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates") pod "prometheus-operator-admission-webhook-8444df798b-vqvsg" (UID: "14be01bd-4943-46df-a51e-deeac401e9e0") : secret "prometheus-operator-admission-webhook-tls" not found
Mar 18 16:47:07.547309 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.547271 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" event={"ID":"f0e93606-444c-4b6a-8294-6603a2b534e8","Type":"ContainerStarted","Data":"4a2abad0c5a6601ae25def1f6e10040d640efe3469695beb042ded63aec69272"}
Mar 18 16:47:07.562993 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.562927 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-2t9zt" podStartSLOduration=33.8347966 podStartE2EDuration="35.562910483s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 16:47:05.113079032 +0000 UTC m=+165.642952192" lastFinishedPulling="2026-03-18 16:47:06.841192916 +0000 UTC m=+167.371066075" observedRunningTime="2026-03-18 16:47:07.562447694 +0000 UTC m=+168.092320877" watchObservedRunningTime="2026-03-18 16:47:07.562910483 +0000 UTC m=+168.092783663"
Mar 18 16:47:07.987210 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.987171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-vqvsg\" (UID: \"14be01bd-4943-46df-a51e-deeac401e9e0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:07.989681 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:07.989661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/14be01bd-4943-46df-a51e-deeac401e9e0-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-vqvsg\" (UID: \"14be01bd-4943-46df-a51e-deeac401e9e0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:08.121560 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:08.121525 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n"
Mar 18 16:47:08.242049 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:08.241935 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:08.360149 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:08.360117 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"]
Mar 18 16:47:08.363139 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:08.363110 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14be01bd_4943_46df_a51e_deeac401e9e0.slice/crio-d2ee4b1c08f21a034b0466634267594a5c1e1471b33b390b25ec9e6712c040d2 WatchSource:0}: Error finding container d2ee4b1c08f21a034b0466634267594a5c1e1471b33b390b25ec9e6712c040d2: Status 404 returned error can't find the container with id d2ee4b1c08f21a034b0466634267594a5c1e1471b33b390b25ec9e6712c040d2
Mar 18 16:47:08.551416 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:08.551330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg" event={"ID":"14be01bd-4943-46df-a51e-deeac401e9e0","Type":"ContainerStarted","Data":"d2ee4b1c08f21a034b0466634267594a5c1e1471b33b390b25ec9e6712c040d2"}
Mar 18 16:47:09.554682 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:09.554646 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg" event={"ID":"14be01bd-4943-46df-a51e-deeac401e9e0","Type":"ContainerStarted","Data":"b0c89629db36cc6bbe4b751f1cac7fa753caffa2fc93aac49ea47797bda7e5f7"}
Mar 18 16:47:09.555155 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:09.554860 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:09.559606 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:09.559586 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg"
Mar 18 16:47:09.570205 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:09.570087 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-vqvsg" podStartSLOduration=1.501690387 podStartE2EDuration="2.570073064s" podCreationTimestamp="2026-03-18 16:47:07 +0000 UTC" firstStartedPulling="2026-03-18 16:47:08.365468497 +0000 UTC m=+168.895341656" lastFinishedPulling="2026-03-18 16:47:09.433851173 +0000 UTC m=+169.963724333" observedRunningTime="2026-03-18 16:47:09.569967745 +0000 UTC m=+170.099840926" watchObservedRunningTime="2026-03-18 16:47:09.570073064 +0000 UTC m=+170.099946245"
Mar 18 16:47:10.407655 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.407622 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g598t"]
Mar 18 16:47:10.410461 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.410443 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.412443 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.412423 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-sqxbj\""
Mar 18 16:47:10.412745 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.412722 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Mar 18 16:47:10.413185 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.413166 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Mar 18 16:47:10.413259 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.413230 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:47:10.442601 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.442575 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g598t"]
Mar 18 16:47:10.506372 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.506340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqlt6\" (UniqueName: \"kubernetes.io/projected/8909e55e-6aec-466e-8776-e284dd7ecf1f-kube-api-access-bqlt6\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.506514 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.506384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.506514 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.506418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8909e55e-6aec-466e-8776-e284dd7ecf1f-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.506514 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.506501 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.606778 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.606747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.607208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.606784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8909e55e-6aec-466e-8776-e284dd7ecf1f-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.607208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.606847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t"
Mar 18 16:47:10.607208 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:10.606905 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 18 16:47:10.607208 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:10.606991 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls podName:8909e55e-6aec-466e-8776-e284dd7ecf1f nodeName:}" failed. No retries permitted until 2026-03-18 16:47:11.106949922 +0000 UTC m=+171.636823094 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls") pod "prometheus-operator-6b948c769-g598t" (UID: "8909e55e-6aec-466e-8776-e284dd7ecf1f") : secret "prometheus-operator-tls" not found Mar 18 16:47:10.607208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.607086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqlt6\" (UniqueName: \"kubernetes.io/projected/8909e55e-6aec-466e-8776-e284dd7ecf1f-kube-api-access-bqlt6\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:10.608958 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.608930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8909e55e-6aec-466e-8776-e284dd7ecf1f-metrics-client-ca\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:10.609997 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.609951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:10.617743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:10.617723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqlt6\" (UniqueName: \"kubernetes.io/projected/8909e55e-6aec-466e-8776-e284dd7ecf1f-kube-api-access-bqlt6\") pod \"prometheus-operator-6b948c769-g598t\" (UID: 
\"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:11.109644 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:11.109604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:11.111981 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:11.111940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8909e55e-6aec-466e-8776-e284dd7ecf1f-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-g598t\" (UID: \"8909e55e-6aec-466e-8776-e284dd7ecf1f\") " pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:11.341990 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:11.341944 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" Mar 18 16:47:11.457258 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:11.457086 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-g598t"] Mar 18 16:47:11.461513 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:11.461482 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8909e55e_6aec_466e_8776_e284dd7ecf1f.slice/crio-c2ad8ce36998a51a3b0e2eb50a90df366085f9512d6d11268a33bee82dfb0838 WatchSource:0}: Error finding container c2ad8ce36998a51a3b0e2eb50a90df366085f9512d6d11268a33bee82dfb0838: Status 404 returned error can't find the container with id c2ad8ce36998a51a3b0e2eb50a90df366085f9512d6d11268a33bee82dfb0838 Mar 18 16:47:11.561249 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:11.561211 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" event={"ID":"8909e55e-6aec-466e-8776-e284dd7ecf1f","Type":"ContainerStarted","Data":"c2ad8ce36998a51a3b0e2eb50a90df366085f9512d6d11268a33bee82dfb0838"} Mar 18 16:47:13.536836 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:13.536811 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zdwcx" Mar 18 16:47:13.570096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:13.570059 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" event={"ID":"8909e55e-6aec-466e-8776-e284dd7ecf1f","Type":"ContainerStarted","Data":"40fe13ccbf51fa583cdaeeb3106d08e52ec84ea01239449ca66b0d2cff10b7b1"} Mar 18 16:47:13.570096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:13.570100 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" 
event={"ID":"8909e55e-6aec-466e-8776-e284dd7ecf1f","Type":"ContainerStarted","Data":"d0888a0e1381497216de4b5bc94232c8daec4f0d65dcfa435bea84d0bf7a9137"} Mar 18 16:47:13.586804 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:13.586742 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6b948c769-g598t" podStartSLOduration=2.048785374 podStartE2EDuration="3.586722614s" podCreationTimestamp="2026-03-18 16:47:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:11.46288109 +0000 UTC m=+171.992754249" lastFinishedPulling="2026-03-18 16:47:13.000818306 +0000 UTC m=+173.530691489" observedRunningTime="2026-03-18 16:47:13.586029368 +0000 UTC m=+174.115902550" watchObservedRunningTime="2026-03-18 16:47:13.586722614 +0000 UTC m=+174.116595795" Mar 18 16:47:15.748626 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.748585 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"] Mar 18 16:47:15.751282 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.751259 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.756435 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.756403 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:47:15.757238 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.757216 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Mar 18 16:47:15.757365 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.757238 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-g5mvs\"" Mar 18 16:47:15.757365 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.757273 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Mar 18 16:47:15.757729 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.757651 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-82cfk"] Mar 18 16:47:15.760212 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.760197 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.762490 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.762470 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qqtwn\"" Mar 18 16:47:15.762633 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.762617 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:47:15.762756 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.762741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:47:15.766999 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.763259 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:47:15.768040 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.767631 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"] Mar 18 16:47:15.838438 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838605 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-textfile\") pod \"node-exporter-82cfk\" (UID: 
\"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838605 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838605 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68qcg\" (UniqueName: \"kubernetes.io/projected/03c91678-fee8-415c-9e15-85a3cedd1294-kube-api-access-68qcg\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838605 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-metrics-client-ca\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838748 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838660 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-root\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838748 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03c91678-fee8-415c-9e15-85a3cedd1294-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838748 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-accelerators-collector-config\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838748 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-sys\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838748 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838935 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-wtmp\") pod \"node-exporter-82cfk\" (UID: 
\"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838935 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkg79\" (UniqueName: \"kubernetes.io/projected/0d4a41e6-6f72-45dd-a626-762b2819804c-kube-api-access-nkg79\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.838935 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838935 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.838935 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.838926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.939235 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939235 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939497 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.939497 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939497 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-textfile\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.939497 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939497 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:15.939485 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68qcg\" (UniqueName: \"kubernetes.io/projected/03c91678-fee8-415c-9e15-85a3cedd1294-kube-api-access-68qcg\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-metrics-client-ca\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:47:15.939551 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls 
podName:0d4a41e6-6f72-45dd-a626-762b2819804c nodeName:}" failed. No retries permitted until 2026-03-18 16:47:16.439534004 +0000 UTC m=+176.969407163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls") pod "node-exporter-82cfk" (UID: "0d4a41e6-6f72-45dd-a626-762b2819804c") : secret "node-exporter-tls" not found Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-root\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03c91678-fee8-415c-9e15-85a3cedd1294-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.939743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-accelerators-collector-config\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-sys\") pod 
\"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-wtmp\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkg79\" (UniqueName: \"kubernetes.io/projected/0d4a41e6-6f72-45dd-a626-762b2819804c-kube-api-access-nkg79\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-textfile\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940078 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.939936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-sys\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940335 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.940335 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940164 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-wtmp\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940335 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c91678-fee8-415c-9e15-85a3cedd1294-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.940476 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0d4a41e6-6f72-45dd-a626-762b2819804c-root\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940476 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940356 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-accelerators-collector-config\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940476 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940380 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d4a41e6-6f72-45dd-a626-762b2819804c-metrics-client-ca\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk" Mar 18 16:47:15.940646 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.940623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03c91678-fee8-415c-9e15-85a3cedd1294-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.942132 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.942101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" Mar 18 16:47:15.942132 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.942122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk"
Mar 18 16:47:15.942441 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.942417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c91678-fee8-415c-9e15-85a3cedd1294-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"
Mar 18 16:47:15.949597 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.949568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68qcg\" (UniqueName: \"kubernetes.io/projected/03c91678-fee8-415c-9e15-85a3cedd1294-kube-api-access-68qcg\") pod \"kube-state-metrics-6df7999c47-z2nll\" (UID: \"03c91678-fee8-415c-9e15-85a3cedd1294\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"
Mar 18 16:47:15.949704 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:15.949597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkg79\" (UniqueName: \"kubernetes.io/projected/0d4a41e6-6f72-45dd-a626-762b2819804c-kube-api-access-nkg79\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk"
Mar 18 16:47:16.060394 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.060314 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"
Mar 18 16:47:16.176891 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.176868 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-z2nll"]
Mar 18 16:47:16.179222 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:16.179194 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c91678_fee8_415c_9e15_85a3cedd1294.slice/crio-5f1663ca0f66b314809182af1f75f20cbf2afb98daa768ffdf7bc9edca19ce22 WatchSource:0}: Error finding container 5f1663ca0f66b314809182af1f75f20cbf2afb98daa768ffdf7bc9edca19ce22: Status 404 returned error can't find the container with id 5f1663ca0f66b314809182af1f75f20cbf2afb98daa768ffdf7bc9edca19ce22
Mar 18 16:47:16.442525 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.442491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk"
Mar 18 16:47:16.444807 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.444786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0d4a41e6-6f72-45dd-a626-762b2819804c-node-exporter-tls\") pod \"node-exporter-82cfk\" (UID: \"0d4a41e6-6f72-45dd-a626-762b2819804c\") " pod="openshift-monitoring/node-exporter-82cfk"
Mar 18 16:47:16.577526 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.577492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" event={"ID":"03c91678-fee8-415c-9e15-85a3cedd1294","Type":"ContainerStarted","Data":"5f1663ca0f66b314809182af1f75f20cbf2afb98daa768ffdf7bc9edca19ce22"}
Mar 18 16:47:16.670666 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.670630 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-82cfk"
Mar 18 16:47:16.680415 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:16.680375 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4a41e6_6f72_45dd_a626_762b2819804c.slice/crio-1e886d2f49dd0b88a2b6e5904a0cf82a30201dade2db9750d1f13dcab6e041e4 WatchSource:0}: Error finding container 1e886d2f49dd0b88a2b6e5904a0cf82a30201dade2db9750d1f13dcab6e041e4: Status 404 returned error can't find the container with id 1e886d2f49dd0b88a2b6e5904a0cf82a30201dade2db9750d1f13dcab6e041e4
Mar 18 16:47:16.857004 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.856911 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:47:16.861594 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.861571 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.878896 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.877929 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Mar 18 16:47:16.878896 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.877940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Mar 18 16:47:16.878896 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.878283 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Mar 18 16:47:16.883110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.883087 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4ltrm\""
Mar 18 16:47:16.883793 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.883423 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Mar 18 16:47:16.884267 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.884244 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Mar 18 16:47:16.884540 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.884387 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Mar 18 16:47:16.884540 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.884392 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Mar 18 16:47:16.884540 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.884449 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Mar 18 16:47:16.885102 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.885072 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Mar 18 16:47:16.895018 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.894950 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.954919 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.954966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955046 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955272 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2tn\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:16.955484 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:16.955367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056242 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2tn\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056491 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.056660 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.056575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.058096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.057230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.058096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.057301 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.058096 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.057363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.059697 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.059676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.060055 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.060033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.061252 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.061211 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.062091 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.062073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.062575 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.062551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.062715 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.062695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.062806 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.062759 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.063060 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.063012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.063373 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.063354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.065243 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.065224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2tn\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn\") pod \"alertmanager-main-0\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.173466 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.173423 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:47:17.490630 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.490588 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:47:17.495373 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:17.495335 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb122b28a_3083_417e_b44b_a28004869260.slice/crio-21709cb0c5354b03dabc1813910d39ccdf1d15d487e3e0cfb5f05f35ef13c8ec WatchSource:0}: Error finding container 21709cb0c5354b03dabc1813910d39ccdf1d15d487e3e0cfb5f05f35ef13c8ec: Status 404 returned error can't find the container with id 21709cb0c5354b03dabc1813910d39ccdf1d15d487e3e0cfb5f05f35ef13c8ec
Mar 18 16:47:17.585485 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.585416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" event={"ID":"03c91678-fee8-415c-9e15-85a3cedd1294","Type":"ContainerStarted","Data":"9b8f6e5dd04610e65adb22000547dfe06274c6b1ee62e2f26448e89daba5efb0"}
Mar 18 16:47:17.585485 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.585462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" event={"ID":"03c91678-fee8-415c-9e15-85a3cedd1294","Type":"ContainerStarted","Data":"4f5b42439d5a2575c0eb2e49a39eae7e536c77f59b3762f145cf67a936cbab50"}
Mar 18 16:47:17.589390 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.589347 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"21709cb0c5354b03dabc1813910d39ccdf1d15d487e3e0cfb5f05f35ef13c8ec"}
Mar 18 16:47:17.591001 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:17.590937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-82cfk" event={"ID":"0d4a41e6-6f72-45dd-a626-762b2819804c","Type":"ContainerStarted","Data":"1e886d2f49dd0b88a2b6e5904a0cf82a30201dade2db9750d1f13dcab6e041e4"}
Mar 18 16:47:18.595589 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:18.595556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" event={"ID":"03c91678-fee8-415c-9e15-85a3cedd1294","Type":"ContainerStarted","Data":"c7025eafe53ad112587c34eecf9544acaf945eee8efc1812c0be7612d2b639c5"}
Mar 18 16:47:18.596706 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:18.596680 2571 generic.go:358] "Generic (PLEG): container finished" podID="0d4a41e6-6f72-45dd-a626-762b2819804c" containerID="bcd64d45f0b3ed57f884a3ee89123c2fa9010beceededb072de7391ae6bc02dd" exitCode=0
Mar 18 16:47:18.596813 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:18.596738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-82cfk" event={"ID":"0d4a41e6-6f72-45dd-a626-762b2819804c","Type":"ContainerDied","Data":"bcd64d45f0b3ed57f884a3ee89123c2fa9010beceededb072de7391ae6bc02dd"}
Mar 18 16:47:18.629725 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:18.629678 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-6df7999c47-z2nll" podStartSLOduration=2.43463297 podStartE2EDuration="3.629659125s" podCreationTimestamp="2026-03-18 16:47:15 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.181128396 +0000 UTC m=+176.711001555" lastFinishedPulling="2026-03-18 16:47:17.376154547 +0000 UTC m=+177.906027710" observedRunningTime="2026-03-18 16:47:18.628809279 +0000 UTC m=+179.158682463" watchObservedRunningTime="2026-03-18 16:47:18.629659125 +0000 UTC m=+179.159532307"
Mar 18 16:47:19.600861 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:19.600823 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c" exitCode=0
Mar 18 16:47:19.601325 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:19.600928 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c"}
Mar 18 16:47:19.603102 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:19.603076 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-82cfk" event={"ID":"0d4a41e6-6f72-45dd-a626-762b2819804c","Type":"ContainerStarted","Data":"c5edf0f7fbfc8946b49851ec95821a479c14c183d112089617af278fdb1fa560"}
Mar 18 16:47:19.603198 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:19.603111 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-82cfk" event={"ID":"0d4a41e6-6f72-45dd-a626-762b2819804c","Type":"ContainerStarted","Data":"7a44b3ec53783c016a7443062f69f24fd0cbc5629038dbb48a7159e8a5db122d"}
Mar 18 16:47:19.690866 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:19.690816 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-82cfk" podStartSLOduration=3.5373825869999997 podStartE2EDuration="4.690802486s" podCreationTimestamp="2026-03-18 16:47:15 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.682090545 +0000 UTC m=+177.211963719" lastFinishedPulling="2026-03-18 16:47:17.835510452 +0000 UTC m=+178.365383618" observedRunningTime="2026-03-18 16:47:19.690071223 +0000 UTC m=+180.219944402" watchObservedRunningTime="2026-03-18 16:47:19.690802486 +0000 UTC m=+180.220675667"
Mar 18 16:47:21.613120 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:21.613085 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b"}
Mar 18 16:47:21.613120 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:21.613121 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d"}
Mar 18 16:47:21.613496 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:21.613129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90"}
Mar 18 16:47:21.613496 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:21.613142 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e"}
Mar 18 16:47:21.613496 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:21.613150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165"}
Mar 18 16:47:22.564452 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.564417 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:47:22.573271 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.573234 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.579781 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.579719 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-l45b4lsc4n1n\""
Mar 18 16:47:22.579781 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.579757 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Mar 18 16:47:22.579920 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.579810 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Mar 18 16:47:22.579985 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.579921 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Mar 18 16:47:22.580175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Mar 18 16:47:22.580257 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580217 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Mar 18 16:47:22.580335 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580319 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Mar 18 16:47:22.580576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580545 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Mar 18 16:47:22.580839 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580823 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Mar 18 16:47:22.580839 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.580831 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Mar 18 16:47:22.581102 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.581087 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x5tmq\""
Mar 18 16:47:22.581102 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.581094 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Mar 18 16:47:22.584155 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.584122 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Mar 18 16:47:22.585621 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.585607 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Mar 18 16:47:22.592009 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.591966 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Mar 18 16:47:22.599323 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599412 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599339 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599412 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599412 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwfg\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599517 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599517 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599517 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599517 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599517 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599519 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599593 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599669 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599862 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:47:22.599862 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599720 2571 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.599862 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.599862 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.599754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.608115 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.608090 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:47:22.620229 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.620200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerStarted","Data":"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4"} Mar 18 16:47:22.700184 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700184 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700437 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700437 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700437 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700585 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700463 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700585 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700519 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700585 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwfg\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700585 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700791 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700791 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:47:22.700628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700791 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.700791 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.700685 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.701728 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.701663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.701845 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.701793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.701906 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.701843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.701967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702076 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.702022 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702127 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.702090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702127 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.702116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702222 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.702136 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.702899 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.702875 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.703193 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.703165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.704416 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.704390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.704416 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.704421 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.704814 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.704791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.705019 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.704995 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.705296 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.705255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.706593 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.706568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.707222 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.707182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.707376 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.707271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.707679 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.707655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.707822 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.707690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.709695 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.709671 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.710245 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.710228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.730213 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.730187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwfg\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg\") pod \"prometheus-k8s-0\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:22.753145 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.753099 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.109759099 podStartE2EDuration="6.753084651s" podCreationTimestamp="2026-03-18 16:47:16 +0000 UTC" firstStartedPulling="2026-03-18 16:47:17.49869847 +0000 UTC m=+178.028571630" lastFinishedPulling="2026-03-18 16:47:22.142024019 +0000 UTC m=+182.671897182" observedRunningTime="2026-03-18 16:47:22.751535974 +0000 UTC m=+183.281409156" watchObservedRunningTime="2026-03-18 16:47:22.753084651 +0000 UTC m=+183.282957831" Mar 18 16:47:22.882745 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:22.882648 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:23.062572 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:23.062549 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:47:23.064139 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:47:23.064111 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10301ffa_6874_4472_b751_a9ed8dcc2293.slice/crio-53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f WatchSource:0}: Error finding container 53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f: Status 404 returned error can't find the container with id 53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f Mar 18 16:47:23.624561 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:23.624530 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="ddd17028fe4e8136c8a3913e4ee100ad628f389b29a5b9307a76aa03bfdab6fb" exitCode=0 Mar 18 16:47:23.624965 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:23.624628 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"ddd17028fe4e8136c8a3913e4ee100ad628f389b29a5b9307a76aa03bfdab6fb"} Mar 18 16:47:23.624965 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:23.624673 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f"} Mar 18 16:47:26.635776 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:26.635692 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"55d14751ad08d7ee9e7970bf1e0780716b60441cd21392c20d7794f332d4fb85"} Mar 18 16:47:26.635776 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:26.635729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"b2e20a6a7d766fe42c81e130b003b79a894fbdbf436d5b34661ef9548e1b13cd"} Mar 18 16:47:28.644940 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:28.644909 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"af54614e2cf5afb973074a22c1b60c9ca499c8cea68e7a8630a3d418343bf2d5"} Mar 18 16:47:28.644940 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:28.644941 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"58303a8e94d369291b38047d9dbd413671ee10a9a63f32606f0a87f9c8382b42"} Mar 18 16:47:28.645387 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:28.644950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"e61bbeae6e6a5d174029f9f31bb096d25f67e518d521142814ad5f11b627f944"} Mar 18 16:47:28.645387 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:28.644959 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerStarted","Data":"c92ffb079258ebf23ae7ea4114b40b66479ed2a68842715de5472fa766960c88"} Mar 18 16:47:28.675173 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:28.675120 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.292275144 podStartE2EDuration="6.675106079s" podCreationTimestamp="2026-03-18 16:47:22 +0000 UTC" firstStartedPulling="2026-03-18 16:47:23.625700342 +0000 UTC m=+184.155573501" lastFinishedPulling="2026-03-18 16:47:28.008531274 +0000 UTC m=+188.538404436" observedRunningTime="2026-03-18 16:47:28.6729266 +0000 UTC m=+189.202799775" watchObservedRunningTime="2026-03-18 16:47:28.675106079 +0000 UTC m=+189.204979259" Mar 18 16:47:32.883429 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:32.883397 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:41.683698 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:41.683664 2571 generic.go:358] "Generic (PLEG): container finished" podID="92e8b58c-5235-4208-9c7e-f4ff8c45861a" containerID="d5d2aa21d30b03a93a7fef2749c1a7e8bc89c76c912d8e2b1f3051e437d9ebc1" exitCode=0 Mar 18 16:47:41.684133 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:41.683739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g" event={"ID":"92e8b58c-5235-4208-9c7e-f4ff8c45861a","Type":"ContainerDied","Data":"d5d2aa21d30b03a93a7fef2749c1a7e8bc89c76c912d8e2b1f3051e437d9ebc1"} Mar 18 16:47:41.684133 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:41.684099 2571 scope.go:117] "RemoveContainer" containerID="d5d2aa21d30b03a93a7fef2749c1a7e8bc89c76c912d8e2b1f3051e437d9ebc1" Mar 18 16:47:42.687457 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:42.687424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-lw72g" event={"ID":"92e8b58c-5235-4208-9c7e-f4ff8c45861a","Type":"ContainerStarted","Data":"8ea0d1fd433690713088143f8380e9ad99904fc9dceb2d4fa0fcc3a3ae8c3f8b"} Mar 18 16:47:56.729316 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:56.729278 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="71b1e3b5-f3de-4e5a-855d-d72d883b476f" containerID="32c79ecb18554e4237832b264fef3940131a693072f06a3409385ceae0f705a2" exitCode=0 Mar 18 16:47:56.729705 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:56.729348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf" event={"ID":"71b1e3b5-f3de-4e5a-855d-d72d883b476f","Type":"ContainerDied","Data":"32c79ecb18554e4237832b264fef3940131a693072f06a3409385ceae0f705a2"} Mar 18 16:47:56.729705 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:56.729655 2571 scope.go:117] "RemoveContainer" containerID="32c79ecb18554e4237832b264fef3940131a693072f06a3409385ceae0f705a2" Mar 18 16:47:57.733646 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:47:57.733616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-mr5cf" event={"ID":"71b1e3b5-f3de-4e5a-855d-d72d883b476f","Type":"ContainerStarted","Data":"b4ad36908f461a4efd10d58abd1a06926e6607df4f017abe6fd9b62c7d8f0e16"} Mar 18 16:48:22.883250 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:22.883211 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:22.898810 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:22.898781 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:23.838113 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:23.838084 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:31.797541 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:31.797441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:48:31.799709 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:31.799676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf79a021-e091-4ae2-bd19-2bd1205de781-metrics-certs\") pod \"network-metrics-daemon-jf76n\" (UID: \"bf79a021-e091-4ae2-bd19-2bd1205de781\") " pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:48:31.825097 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:31.825062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:48:31.833677 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:31.833655 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jf76n" Mar 18 16:48:31.952450 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:31.952425 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jf76n"] Mar 18 16:48:31.955108 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:48:31.955082 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf79a021_e091_4ae2_bd19_2bd1205de781.slice/crio-b26e72d886d0f802f859d8d0ae4279208c8627104720fb4a693da2adb7967e08 WatchSource:0}: Error finding container b26e72d886d0f802f859d8d0ae4279208c8627104720fb4a693da2adb7967e08: Status 404 returned error can't find the container with id b26e72d886d0f802f859d8d0ae4279208c8627104720fb4a693da2adb7967e08 Mar 18 16:48:32.851288 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:32.851252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jf76n" 
event={"ID":"bf79a021-e091-4ae2-bd19-2bd1205de781","Type":"ContainerStarted","Data":"b26e72d886d0f802f859d8d0ae4279208c8627104720fb4a693da2adb7967e08"} Mar 18 16:48:33.855955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:33.855922 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jf76n" event={"ID":"bf79a021-e091-4ae2-bd19-2bd1205de781","Type":"ContainerStarted","Data":"7c7e4ffdeb038bdf2114eef9c23963cd7a046acc810e8f3c587f0d036c60a9b6"} Mar 18 16:48:33.855955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:33.855960 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jf76n" event={"ID":"bf79a021-e091-4ae2-bd19-2bd1205de781","Type":"ContainerStarted","Data":"52d43521fb686a2a29a2f99c04584f1040e39426893d573623ecf95531be5ac3"} Mar 18 16:48:33.871684 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:33.871634 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jf76n" podStartSLOduration=252.952121097 podStartE2EDuration="4m13.871620107s" podCreationTimestamp="2026-03-18 16:44:20 +0000 UTC" firstStartedPulling="2026-03-18 16:48:31.95728493 +0000 UTC m=+252.487158093" lastFinishedPulling="2026-03-18 16:48:32.876783941 +0000 UTC m=+253.406657103" observedRunningTime="2026-03-18 16:48:33.870144781 +0000 UTC m=+254.400017964" watchObservedRunningTime="2026-03-18 16:48:33.871620107 +0000 UTC m=+254.401493289" Mar 18 16:48:36.325946 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.325911 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:36.326392 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326368 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="alertmanager" 
containerID="cri-o://2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" gracePeriod=120 Mar 18 16:48:36.326470 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326436 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-metric" containerID="cri-o://581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" gracePeriod=120 Mar 18 16:48:36.326539 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326469 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy" containerID="cri-o://cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" gracePeriod=120 Mar 18 16:48:36.326539 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326457 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="config-reloader" containerID="cri-o://e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" gracePeriod=120 Mar 18 16:48:36.326539 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326488 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="prom-label-proxy" containerID="cri-o://1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" gracePeriod=120 Mar 18 16:48:36.326539 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.326442 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-web" 
containerID="cri-o://c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" gracePeriod=120 Mar 18 16:48:36.869025 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.868994 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" exitCode=0 Mar 18 16:48:36.869025 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869023 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" exitCode=0 Mar 18 16:48:36.869025 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869029 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" exitCode=0 Mar 18 16:48:36.869025 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869035 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" exitCode=0 Mar 18 16:48:36.869307 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4"} Mar 18 16:48:36.869307 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d"} Mar 18 16:48:36.869307 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e"} Mar 18 16:48:36.869307 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:36.869126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165"} Mar 18 16:48:37.567989 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.567955 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:37.645596 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645568 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645602 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645650 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645755 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:48:37.645695 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645742 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645778 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645812 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645843 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645869 2571 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645897 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz2tn\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.645951 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645926 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.646255 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.645958 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.646255 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.646026 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out\") pod \"b122b28a-3083-417e-b44b-a28004869260\" (UID: \"b122b28a-3083-417e-b44b-a28004869260\") " Mar 18 16:48:37.646255 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.646155 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:37.646255 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.646212 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:37.646457 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.646321 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-alertmanager-main-db\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.646457 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.646341 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.647889 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.647852 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:37.648925 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.648876 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.649213 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649163 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.649431 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649384 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn" (OuterVolumeSpecName: "kube-api-access-rz2tn") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "kube-api-access-rz2tn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:37.649528 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649451 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). 
InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.649591 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649566 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume" (OuterVolumeSpecName: "config-volume") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.649648 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649582 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:37.649882 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649857 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.649938 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.649909 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out" (OuterVolumeSpecName: "config-out") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:37.653292 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.653268 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.659810 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.659754 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config" (OuterVolumeSpecName: "web-config") pod "b122b28a-3083-417e-b44b-a28004869260" (UID: "b122b28a-3083-417e-b44b-a28004869260"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:37.746824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746785 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.746824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746823 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-web-config\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.746824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746835 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-tls-assets\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:48:37.746847 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-cluster-tls-config\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746859 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rz2tn\" (UniqueName: \"kubernetes.io/projected/b122b28a-3083-417e-b44b-a28004869260-kube-api-access-rz2tn\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746871 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b122b28a-3083-417e-b44b-a28004869260-metrics-client-ca\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746884 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746897 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b122b28a-3083-417e-b44b-a28004869260-config-out\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746909 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-config-volume\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746922 2571 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-main-tls\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.747116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.746937 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b122b28a-3083-417e-b44b-a28004869260-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:37.874884 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.874851 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" exitCode=0 Mar 18 16:48:37.874884 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.874878 2571 generic.go:358] "Generic (PLEG): container finished" podID="b122b28a-3083-417e-b44b-a28004869260" containerID="c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" exitCode=0 Mar 18 16:48:37.875092 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.874933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b"} Mar 18 16:48:37.875092 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.874966 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:37.875092 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.875005 2571 scope.go:117] "RemoveContainer" containerID="1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" Mar 18 16:48:37.875204 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.874992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90"} Mar 18 16:48:37.875204 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.875137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b122b28a-3083-417e-b44b-a28004869260","Type":"ContainerDied","Data":"21709cb0c5354b03dabc1813910d39ccdf1d15d487e3e0cfb5f05f35ef13c8ec"} Mar 18 16:48:37.882768 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.882747 2571 scope.go:117] "RemoveContainer" containerID="581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" Mar 18 16:48:37.893198 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.893179 2571 scope.go:117] "RemoveContainer" containerID="cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" Mar 18 16:48:37.900392 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.900369 2571 scope.go:117] "RemoveContainer" containerID="c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" Mar 18 16:48:37.903392 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.903371 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:37.908820 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.908796 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:37.909147 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.909131 2571 scope.go:117] 
"RemoveContainer" containerID="e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" Mar 18 16:48:37.916325 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.916302 2571 scope.go:117] "RemoveContainer" containerID="2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" Mar 18 16:48:37.922962 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.922946 2571 scope.go:117] "RemoveContainer" containerID="5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c" Mar 18 16:48:37.929579 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.929563 2571 scope.go:117] "RemoveContainer" containerID="1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" Mar 18 16:48:37.929832 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.929813 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4\": container with ID starting with 1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4 not found: ID does not exist" containerID="1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" Mar 18 16:48:37.929876 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.929840 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4"} err="failed to get container status \"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4\": rpc error: code = NotFound desc = could not find container \"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4\": container with ID starting with 1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4 not found: ID does not exist" Mar 18 16:48:37.929876 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.929873 2571 scope.go:117] "RemoveContainer" containerID="581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" Mar 18 
16:48:37.930198 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.930175 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b\": container with ID starting with 581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b not found: ID does not exist" containerID="581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" Mar 18 16:48:37.930367 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.930199 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b"} err="failed to get container status \"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b\": rpc error: code = NotFound desc = could not find container \"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b\": container with ID starting with 581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b not found: ID does not exist" Mar 18 16:48:37.930367 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.930217 2571 scope.go:117] "RemoveContainer" containerID="cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" Mar 18 16:48:37.930754 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.930718 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d\": container with ID starting with cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d not found: ID does not exist" containerID="cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" Mar 18 16:48:37.930830 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.930751 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d"} err="failed to get container status \"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d\": rpc error: code = NotFound desc = could not find container \"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d\": container with ID starting with cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d not found: ID does not exist" Mar 18 16:48:37.930830 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.930776 2571 scope.go:117] "RemoveContainer" containerID="c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" Mar 18 16:48:37.931053 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.931036 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90\": container with ID starting with c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90 not found: ID does not exist" containerID="c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" Mar 18 16:48:37.931095 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931066 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90"} err="failed to get container status \"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90\": rpc error: code = NotFound desc = could not find container \"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90\": container with ID starting with c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90 not found: ID does not exist" Mar 18 16:48:37.931095 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931085 2571 scope.go:117] "RemoveContainer" containerID="e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" Mar 18 16:48:37.931344 ip-10-0-139-43 
kubenswrapper[2571]: E0318 16:48:37.931327 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e\": container with ID starting with e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e not found: ID does not exist" containerID="e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" Mar 18 16:48:37.931389 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931348 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e"} err="failed to get container status \"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e\": rpc error: code = NotFound desc = could not find container \"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e\": container with ID starting with e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e not found: ID does not exist" Mar 18 16:48:37.931389 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931362 2571 scope.go:117] "RemoveContainer" containerID="2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" Mar 18 16:48:37.931570 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.931545 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165\": container with ID starting with 2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165 not found: ID does not exist" containerID="2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" Mar 18 16:48:37.931622 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931572 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165"} err="failed to 
get container status \"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165\": rpc error: code = NotFound desc = could not find container \"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165\": container with ID starting with 2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165 not found: ID does not exist" Mar 18 16:48:37.931622 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931584 2571 scope.go:117] "RemoveContainer" containerID="5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c" Mar 18 16:48:37.931824 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:48:37.931808 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c\": container with ID starting with 5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c not found: ID does not exist" containerID="5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c" Mar 18 16:48:37.931864 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931829 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c"} err="failed to get container status \"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c\": rpc error: code = NotFound desc = could not find container \"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c\": container with ID starting with 5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c not found: ID does not exist" Mar 18 16:48:37.931864 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.931845 2571 scope.go:117] "RemoveContainer" containerID="1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4" Mar 18 16:48:37.932050 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932029 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4"} err="failed to get container status \"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4\": rpc error: code = NotFound desc = could not find container \"1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4\": container with ID starting with 1d4453a439428b8dcf09f7610c7154811a06b726f09fa79f8dce3002ca94afa4 not found: ID does not exist" Mar 18 16:48:37.932111 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932052 2571 scope.go:117] "RemoveContainer" containerID="581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b" Mar 18 16:48:37.932286 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932269 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b"} err="failed to get container status \"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b\": rpc error: code = NotFound desc = could not find container \"581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b\": container with ID starting with 581c00a06cd11d7b3e1941e955d3feab0df2acfb77705b6c532f0d6dccac456b not found: ID does not exist" Mar 18 16:48:37.932335 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932288 2571 scope.go:117] "RemoveContainer" containerID="cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d" Mar 18 16:48:37.932494 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932480 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d"} err="failed to get container status \"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d\": rpc error: code = NotFound desc = could not find container \"cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d\": container with ID starting with 
cb11d8c12458bfeb26aa8d6be6457bbb63edc5f6b8789009fcc0cb77688ac29d not found: ID does not exist" Mar 18 16:48:37.932552 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932495 2571 scope.go:117] "RemoveContainer" containerID="c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90" Mar 18 16:48:37.932713 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932694 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90"} err="failed to get container status \"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90\": rpc error: code = NotFound desc = could not find container \"c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90\": container with ID starting with c70d48fce16c88d5559f5f3e9c0d313200611ff9e70a1b357abbc1e6db117b90 not found: ID does not exist" Mar 18 16:48:37.932756 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932714 2571 scope.go:117] "RemoveContainer" containerID="e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e" Mar 18 16:48:37.932910 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932893 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e"} err="failed to get container status \"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e\": rpc error: code = NotFound desc = could not find container \"e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e\": container with ID starting with e094ed37c4ba0a6b0865875e9ccdd41a4310408b5e018293b06bdc9b3573304e not found: ID does not exist" Mar 18 16:48:37.932952 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.932911 2571 scope.go:117] "RemoveContainer" containerID="2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165" Mar 18 16:48:37.933127 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.933105 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165"} err="failed to get container status \"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165\": rpc error: code = NotFound desc = could not find container \"2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165\": container with ID starting with 2ce25126d43c3167b0f4d97db1f9eb8b92b5da87049dc4a3f3c76217a9949165 not found: ID does not exist" Mar 18 16:48:37.933127 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.933122 2571 scope.go:117] "RemoveContainer" containerID="5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c" Mar 18 16:48:37.933334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.933317 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c"} err="failed to get container status \"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c\": rpc error: code = NotFound desc = could not find container \"5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c\": container with ID starting with 5078c6f31d820f534427621d0e7c75ae9778e9e1388361859dd76bde29b5379c not found: ID does not exist" Mar 18 16:48:37.939443 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939420 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:37.939709 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939698 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939711 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy" Mar 18 16:48:37.939753 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:48:37.939719 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="config-reloader" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939724 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="config-reloader" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939734 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-web" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939739 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-web" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939745 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="prom-label-proxy" Mar 18 16:48:37.939753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939750 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="prom-label-proxy" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939761 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-metric" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939766 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-metric" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939772 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" 
containerName="alertmanager" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939777 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="alertmanager" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939786 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="init-config-reloader" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939791 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="init-config-reloader" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939834 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="config-reloader" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939841 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="alertmanager" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939849 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939856 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-web" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939864 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" containerName="kube-rbac-proxy-metric" Mar 18 16:48:37.939953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.939870 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b122b28a-3083-417e-b44b-a28004869260" 
containerName="prom-label-proxy" Mar 18 16:48:37.944957 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.944940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:37.948642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948620 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Mar 18 16:48:37.948755 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948711 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Mar 18 16:48:37.948824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948757 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Mar 18 16:48:37.948824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948620 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Mar 18 16:48:37.948824 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948770 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Mar 18 16:48:37.948959 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948631 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Mar 18 16:48:37.948959 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.948903 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4ltrm\"" Mar 18 16:48:37.949110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.949041 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" 
Mar 18 16:48:37.949627 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.949610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Mar 18 16:48:37.952357 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.952341 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Mar 18 16:48:37.955462 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:37.955445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:38.049740 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049707 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-web-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049740 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:48:38.049799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-out\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049835 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb9l6\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-kube-api-access-lb9l6\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.049955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.050164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.049957 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.050164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.050018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.050164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.050040 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.050164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.050066 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.050164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.050116 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.125780 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.125750 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b122b28a-3083-417e-b44b-a28004869260" path="/var/lib/kubelet/pods/b122b28a-3083-417e-b44b-a28004869260/volumes" Mar 18 16:48:38.150472 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-volume\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150603 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-web-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:48:38.150669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-out\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.150840 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.150743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb9l6\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-kube-api-access-lb9l6\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.151167 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.151059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.151222 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.151164 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.151765 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.151704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.152466 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.152422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/343eeaf2-6d7d-4ebc-9267-226d234372e5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.153671 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.153616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.153766 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.153749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-out\") pod 
\"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154098 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154098 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154264 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154334 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154313 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154388 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.154717 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.154697 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-config-volume\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.155277 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.155263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/343eeaf2-6d7d-4ebc-9267-226d234372e5-web-config\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.159135 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.159115 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb9l6\" (UniqueName: \"kubernetes.io/projected/343eeaf2-6d7d-4ebc-9267-226d234372e5-kube-api-access-lb9l6\") pod \"alertmanager-main-0\" (UID: \"343eeaf2-6d7d-4ebc-9267-226d234372e5\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.257490 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.257408 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:48:38.382640 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.382615 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:48:38.385279 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:48:38.385252 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343eeaf2_6d7d_4ebc_9267_226d234372e5.slice/crio-c075f4f566d867b7346b1ad85c0c3cff18c180d8dcd3942ead19739d844ad5d5 WatchSource:0}: Error finding container c075f4f566d867b7346b1ad85c0c3cff18c180d8dcd3942ead19739d844ad5d5: Status 404 returned error can't find the container with id c075f4f566d867b7346b1ad85c0c3cff18c180d8dcd3942ead19739d844ad5d5 Mar 18 16:48:38.879208 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.879175 2571 generic.go:358] "Generic (PLEG): container finished" podID="343eeaf2-6d7d-4ebc-9267-226d234372e5" containerID="98c2e055f6ac35c720445665bd5e43dcae993e0f073f46c0dc611200fcec3fe6" exitCode=0 Mar 18 16:48:38.879635 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.879215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerDied","Data":"98c2e055f6ac35c720445665bd5e43dcae993e0f073f46c0dc611200fcec3fe6"} Mar 18 16:48:38.879635 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:38.879250 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"c075f4f566d867b7346b1ad85c0c3cff18c180d8dcd3942ead19739d844ad5d5"} Mar 18 16:48:39.885594 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"c68ecb4babf8fbe42b1afe086a9e0a032b6f55b8cf37a5353b2e67a690a8bba3"} Mar 18 16:48:39.885594 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885598 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"3cfe2f802fc3c10b40b20191cb3c18661131f98800c12d8fcda2d44e830c0cb1"} Mar 18 16:48:39.886030 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"a3897719befaad99f5e14cb7ee224ee7b4e92e2b326e9978085460ca95ab5674"} Mar 18 16:48:39.886030 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"ebd83a6c8fb894dbc232e7b5265abcc0c71b41f7c205d5c6fa214ca674dd92d1"} Mar 18 16:48:39.886030 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"8a3d774ace1cb7ea8df09e85d159656bf87b22f9161101f756e9d664f8156f09"} Mar 18 16:48:39.886030 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.885635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"343eeaf2-6d7d-4ebc-9267-226d234372e5","Type":"ContainerStarted","Data":"6ab93c5df4cadfc29cccc5c631b585a81506dc35cba9552cd2b0feb7c06ec0cb"} Mar 18 16:48:39.912836 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:39.912777 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.912763015 podStartE2EDuration="2.912763015s" podCreationTimestamp="2026-03-18 16:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:39.912506916 +0000 UTC m=+260.442380130" watchObservedRunningTime="2026-03-18 16:48:39.912763015 +0000 UTC m=+260.442636197" Mar 18 16:48:40.641048 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641011 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:48:40.641658 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641622 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy" containerID="cri-o://58303a8e94d369291b38047d9dbd413671ee10a9a63f32606f0a87f9c8382b42" gracePeriod=600 Mar 18 16:48:40.641768 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641660 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="thanos-sidecar" containerID="cri-o://c92ffb079258ebf23ae7ea4114b40b66479ed2a68842715de5472fa766960c88" gracePeriod=600 Mar 18 16:48:40.641835 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641611 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="prometheus" containerID="cri-o://b2e20a6a7d766fe42c81e130b003b79a894fbdbf436d5b34661ef9548e1b13cd" gracePeriod=600 Mar 18 16:48:40.641835 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641775 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" 
containerName="kube-rbac-proxy-thanos" containerID="cri-o://af54614e2cf5afb973074a22c1b60c9ca499c8cea68e7a8630a3d418343bf2d5" gracePeriod=600 Mar 18 16:48:40.641941 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641776 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-web" containerID="cri-o://e61bbeae6e6a5d174029f9f31bb096d25f67e518d521142814ad5f11b627f944" gracePeriod=600 Mar 18 16:48:40.641941 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.641887 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="config-reloader" containerID="cri-o://55d14751ad08d7ee9e7970bf1e0780716b60441cd21392c20d7794f332d4fb85" gracePeriod=600 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893023 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="af54614e2cf5afb973074a22c1b60c9ca499c8cea68e7a8630a3d418343bf2d5" exitCode=0 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893047 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="58303a8e94d369291b38047d9dbd413671ee10a9a63f32606f0a87f9c8382b42" exitCode=0 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893055 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="e61bbeae6e6a5d174029f9f31bb096d25f67e518d521142814ad5f11b627f944" exitCode=0 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893062 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="c92ffb079258ebf23ae7ea4114b40b66479ed2a68842715de5472fa766960c88" 
exitCode=0 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893067 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="55d14751ad08d7ee9e7970bf1e0780716b60441cd21392c20d7794f332d4fb85" exitCode=0 Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"af54614e2cf5afb973074a22c1b60c9ca499c8cea68e7a8630a3d418343bf2d5"} Mar 18 16:48:40.893119 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893115 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"58303a8e94d369291b38047d9dbd413671ee10a9a63f32606f0a87f9c8382b42"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893131 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"e61bbeae6e6a5d174029f9f31bb096d25f67e518d521142814ad5f11b627f944"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"c92ffb079258ebf23ae7ea4114b40b66479ed2a68842715de5472fa766960c88"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893085 2571 generic.go:358] "Generic (PLEG): container finished" podID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerID="b2e20a6a7d766fe42c81e130b003b79a894fbdbf436d5b34661ef9548e1b13cd" exitCode=0 Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893161 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"55d14751ad08d7ee9e7970bf1e0780716b60441cd21392c20d7794f332d4fb85"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"b2e20a6a7d766fe42c81e130b003b79a894fbdbf436d5b34661ef9548e1b13cd"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"10301ffa-6874-4472-b751-a9ed8dcc2293","Type":"ContainerDied","Data":"53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f"} Mar 18 16:48:40.893686 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893295 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53cfd455b0beebc282bc12421d44e30a98f36e6a3a68607cd83c855a08d2698f" Mar 18 16:48:40.894016 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.893999 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:40.974760 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974734 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.974760 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974770 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974794 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974829 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974846 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" 
(UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974877 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974899 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974932 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975006 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.974955 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975016 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qwfg\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 
16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975065 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975090 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975127 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975155 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975182 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975181 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975204 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975208 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975266 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975328 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975323 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets\") pod \"10301ffa-6874-4472-b751-a9ed8dcc2293\" (UID: \"10301ffa-6874-4472-b751-a9ed8dcc2293\") " Mar 18 16:48:40.975836 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975772 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:40.975836 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.975801 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:40.976353 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.976325 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:40.976529 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.976501 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:40.976945 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.976918 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:40.978705 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.978412 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.978705 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.978739 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:40.978705 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.978821 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config" (OuterVolumeSpecName: "config") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.980170 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.979669 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.980718 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.980695 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.980940 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.980903 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.981039 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.980940 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.981039 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.980945 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:40.981179 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.981052 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.981243 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.981216 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out" (OuterVolumeSpecName: "config-out") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:48:40.981493 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.981469 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:40.981562 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.981492 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg" (OuterVolumeSpecName: "kube-api-access-8qwfg") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "kube-api-access-8qwfg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:40.992146 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:40.992120 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config" (OuterVolumeSpecName: "web-config") pod "10301ffa-6874-4472-b751-a9ed8dcc2293" (UID: "10301ffa-6874-4472-b751-a9ed8dcc2293"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:41.076345 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076300 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-db\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076345 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076339 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-web-config\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076345 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076354 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:48:41.076364 2571 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-kube-rbac-proxy\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076374 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qwfg\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-kube-api-access-8qwfg\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076382 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10301ffa-6874-4472-b751-a9ed8dcc2293-config-out\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076391 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076400 2571 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-metrics-client-certs\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076410 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: 
I0318 16:48:41.076418 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076428 2571 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-grpc-tls\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076436 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10301ffa-6874-4472-b751-a9ed8dcc2293-configmap-metrics-client-ca\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076444 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10301ffa-6874-4472-b751-a9ed8dcc2293-tls-assets\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076452 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076460 2571 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-config\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:48:41.076578 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.076469 2571 reconciler_common.go:299] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10301ffa-6874-4472-b751-a9ed8dcc2293-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 16:48:41.896958 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.896927 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:41.919472 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.919447 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:48:41.925311 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.925283 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:48:41.952308 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952275 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:48:41.952712 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952696 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-thanos"
Mar 18 16:48:41.952763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952715 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-thanos"
Mar 18 16:48:41.952763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952734 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="prometheus"
Mar 18 16:48:41.952763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952743 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="prometheus"
Mar 18 16:48:41.952763 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952754 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-web"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952763 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-web"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952780 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="init-config-reloader"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952789 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="init-config-reloader"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952800 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="thanos-sidecar"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952808 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="thanos-sidecar"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952821 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="config-reloader"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952829 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="config-reloader"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952837 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy"
Mar 18 16:48:41.952890 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952845 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952900 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="config-reloader"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952914 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="prometheus"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952922 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-web"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952932 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy-thanos"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952940 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="thanos-sidecar"
Mar 18 16:48:41.953183 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.952952 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" containerName="kube-rbac-proxy"
Mar 18 16:48:41.957874 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.957855 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:41.960003 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.959985 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Mar 18 16:48:41.960110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960091 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Mar 18 16:48:41.960278 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960230 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-l45b4lsc4n1n\""
Mar 18 16:48:41.960278 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960268 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Mar 18 16:48:41.960455 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960296 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Mar 18 16:48:41.960455 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960309 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Mar 18 16:48:41.960455 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960377 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Mar 18 16:48:41.960798 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960778 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Mar 18 16:48:41.960904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960810 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x5tmq\""
Mar 18 16:48:41.960904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960828 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Mar 18 16:48:41.960904 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960785 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Mar 18 16:48:41.961089 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960793 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Mar 18 16:48:41.961089 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.960784 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Mar 18 16:48:41.963590 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.963571 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Mar 18 16:48:41.966955 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.966937 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Mar 18 16:48:41.969797 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:41.969777 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:48:42.084388 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084388 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084388 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084388 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084567 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.084687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.085062 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjvg\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-kube-api-access-xhjvg\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.085062 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084726 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.085062 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.085062 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.085062 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.084828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.125717 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.125686 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10301ffa-6874-4472-b751-a9ed8dcc2293" path="/var/lib/kubelet/pods/10301ffa-6874-4472-b751-a9ed8dcc2293/volumes"
Mar 18 16:48:42.186020 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.185965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186207 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186207 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186207 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186364 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186364 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186364 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186500 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjvg\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-kube-api-access-xhjvg\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.186743 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.186733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189453 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189453 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189453 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189453 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189453 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189932 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189932 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.189932 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.189877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.190633 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.190285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.190633 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.190592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.190771 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.190745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.191483 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.191458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.191711 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.191690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.192148 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.192130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.192315 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.192296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.192742 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.192727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.197703 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.197683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjvg\" (UniqueName: \"kubernetes.io/projected/c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b-kube-api-access-xhjvg\") pod \"prometheus-k8s-0\" (UID: \"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.268665 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.268628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:48:42.422576 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.422539 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 18 16:48:42.430279 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:48:42.430252 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f4dffc_a8b9_48e9_896e_9ff2abf4c12b.slice/crio-3016991fc003a98b2c1cac7009239ab38268c0be4e48e1e4ac59430f5795e440 WatchSource:0}: Error finding container 3016991fc003a98b2c1cac7009239ab38268c0be4e48e1e4ac59430f5795e440: Status 404 returned error can't find the container with id 3016991fc003a98b2c1cac7009239ab38268c0be4e48e1e4ac59430f5795e440
Mar 18 16:48:42.902034 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.902003 2571 generic.go:358] "Generic (PLEG): container finished" podID="c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b" containerID="c90688962390121e78a84e35f56e4401fca3af4a2eaf3fe11c0da447d1e73a50" exitCode=0
Mar 18 16:48:42.902412 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.902095 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerDied","Data":"c90688962390121e78a84e35f56e4401fca3af4a2eaf3fe11c0da447d1e73a50"}
Mar 18 16:48:42.902412 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:42.902131 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"3016991fc003a98b2c1cac7009239ab38268c0be4e48e1e4ac59430f5795e440"}
Mar 18 16:48:43.912770 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912740 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"f6a2c056644c34d9d3a9f1f37a314a041fad4d074e052115e4e7fd340ad84c43"}
Mar 18 16:48:43.912770 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"152338c1755273a9c2cebd8d7a5daccf8874709d3e6cf883b07e109022b0edde"}
Mar 18 16:48:43.912770 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912782 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"bcb5617878c50dbc4b38b4ae13f573262f3503f3a2a3b0839a05da673b0eb87e"}
Mar 18 16:48:43.913253 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912790 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"3df5462f4e443d998a629778f901a997e5e650a3d91bdd1a548bdeae944ec7a4"}
Mar 18 16:48:43.913253 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"ef62e5f904dc85ffeedf38273617d251c6a2feba14dcd10f14e63bdaeb297a0e"}
Mar 18 16:48:43.913253 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.912807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b","Type":"ContainerStarted","Data":"11fd796d5ae673535e216e41e29209b6b9ae36dd462c2b366b97de7626a4d23e"}
Mar 18 16:48:43.943598 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:43.943543 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.943527042 podStartE2EDuration="2.943527042s" podCreationTimestamp="2026-03-18 16:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:43.941064468 +0000 UTC m=+264.470937649" watchObservedRunningTime="2026-03-18 16:48:43.943527042 +0000 UTC m=+264.473400223"
Mar 18 16:48:47.269358 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:48:47.269331 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:49:19.997601 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:49:19.997580 2571 kubelet.go:1628] "Image garbage collection succeeded"
Mar 18 16:49:42.269233 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:49:42.269197 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:49:42.284225 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:49:42.284203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:49:43.108326 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:49:43.108302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 18 16:50:40.331117 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.331081 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x942k"]
Mar 18 16:50:40.334024 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.334004 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.336175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.336153 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 18 16:50:40.341016 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.340992 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x942k"]
Mar 18 16:50:40.454735 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.454690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-dbus\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.454924 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.454753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9addceab-d971-4f5c-a01e-acfdb1254fab-original-pull-secret\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.454924 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.454834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-kubelet-config\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.555536 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.555497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-kubelet-config\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.555715 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.555559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-dbus\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.555715 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.555598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9addceab-d971-4f5c-a01e-acfdb1254fab-original-pull-secret\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.555715 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.555615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-kubelet-config\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.555826 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.555740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9addceab-d971-4f5c-a01e-acfdb1254fab-dbus\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k"
Mar 18 16:50:40.557781
ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.557765 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9addceab-d971-4f5c-a01e-acfdb1254fab-original-pull-secret\") pod \"global-pull-secret-syncer-x942k\" (UID: \"9addceab-d971-4f5c-a01e-acfdb1254fab\") " pod="kube-system/global-pull-secret-syncer-x942k" Mar 18 16:50:40.643046 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.642943 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x942k" Mar 18 16:50:40.758376 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.758349 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x942k"] Mar 18 16:50:40.760960 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:50:40.760933 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9addceab_d971_4f5c_a01e_acfdb1254fab.slice/crio-6c956c1531ea72b96466ffa281f7e1bfe3e15c852048d6700c8aa91d90457135 WatchSource:0}: Error finding container 6c956c1531ea72b96466ffa281f7e1bfe3e15c852048d6700c8aa91d90457135: Status 404 returned error can't find the container with id 6c956c1531ea72b96466ffa281f7e1bfe3e15c852048d6700c8aa91d90457135 Mar 18 16:50:40.762740 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:40.762727 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:50:41.267802 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:41.267770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x942k" event={"ID":"9addceab-d971-4f5c-a01e-acfdb1254fab","Type":"ContainerStarted","Data":"6c956c1531ea72b96466ffa281f7e1bfe3e15c852048d6700c8aa91d90457135"} Mar 18 16:50:45.281483 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:45.281447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-x942k" event={"ID":"9addceab-d971-4f5c-a01e-acfdb1254fab","Type":"ContainerStarted","Data":"5f24e71ab9c8bb85154854f9c26edbbefa7d53e054f7f6211b38cc9b1caeb48e"} Mar 18 16:50:45.297299 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:50:45.297238 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x942k" podStartSLOduration=1.8906356930000001 podStartE2EDuration="5.297218708s" podCreationTimestamp="2026-03-18 16:50:40 +0000 UTC" firstStartedPulling="2026-03-18 16:50:40.762848346 +0000 UTC m=+381.292721504" lastFinishedPulling="2026-03-18 16:50:44.169431357 +0000 UTC m=+384.699304519" observedRunningTime="2026-03-18 16:50:45.295922676 +0000 UTC m=+385.825795855" watchObservedRunningTime="2026-03-18 16:50:45.297218708 +0000 UTC m=+385.827091889" Mar 18 16:53:24.981734 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.981703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-g8c44"] Mar 18 16:53:24.984190 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.984175 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-g8c44" Mar 18 16:53:24.986442 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.986422 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:53:24.986569 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.986452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-r6d2z\"" Mar 18 16:53:24.986569 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.986428 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Mar 18 16:53:24.987095 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.987080 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:53:24.990994 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:24.990762 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-g8c44"] Mar 18 16:53:25.043961 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.043926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzf8\" (UniqueName: \"kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8\") pod \"s3-init-g8c44\" (UID: \"882a0a1f-b512-4373-8265-28e77cc3c62a\") " pod="kserve/s3-init-g8c44" Mar 18 16:53:25.145097 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.145061 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkzf8\" (UniqueName: \"kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8\") pod \"s3-init-g8c44\" (UID: \"882a0a1f-b512-4373-8265-28e77cc3c62a\") " pod="kserve/s3-init-g8c44" Mar 18 16:53:25.153688 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.153659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzf8\" (UniqueName: 
\"kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8\") pod \"s3-init-g8c44\" (UID: \"882a0a1f-b512-4373-8265-28e77cc3c62a\") " pod="kserve/s3-init-g8c44" Mar 18 16:53:25.306914 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.306819 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-g8c44" Mar 18 16:53:25.423271 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.423246 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-g8c44"] Mar 18 16:53:25.425785 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:53:25.425757 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882a0a1f_b512_4373_8265_28e77cc3c62a.slice/crio-f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb WatchSource:0}: Error finding container f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb: Status 404 returned error can't find the container with id f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb Mar 18 16:53:25.763912 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:25.763877 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-g8c44" event={"ID":"882a0a1f-b512-4373-8265-28e77cc3c62a","Type":"ContainerStarted","Data":"f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb"} Mar 18 16:53:30.780642 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:30.780603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-g8c44" event={"ID":"882a0a1f-b512-4373-8265-28e77cc3c62a","Type":"ContainerStarted","Data":"2cae7d31580311193994e5b0e6a21bef446214bf6ff43e380dc81b4bc82a0f6e"} Mar 18 16:53:30.799094 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:30.799038 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-g8c44" podStartSLOduration=2.461683421 podStartE2EDuration="6.799022509s" 
podCreationTimestamp="2026-03-18 16:53:24 +0000 UTC" firstStartedPulling="2026-03-18 16:53:25.427418685 +0000 UTC m=+545.957291845" lastFinishedPulling="2026-03-18 16:53:29.76475777 +0000 UTC m=+550.294630933" observedRunningTime="2026-03-18 16:53:30.798097492 +0000 UTC m=+551.327970674" watchObservedRunningTime="2026-03-18 16:53:30.799022509 +0000 UTC m=+551.328895688" Mar 18 16:53:33.796207 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:33.796172 2571 generic.go:358] "Generic (PLEG): container finished" podID="882a0a1f-b512-4373-8265-28e77cc3c62a" containerID="2cae7d31580311193994e5b0e6a21bef446214bf6ff43e380dc81b4bc82a0f6e" exitCode=0 Mar 18 16:53:33.796620 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:33.796249 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-g8c44" event={"ID":"882a0a1f-b512-4373-8265-28e77cc3c62a","Type":"ContainerDied","Data":"2cae7d31580311193994e5b0e6a21bef446214bf6ff43e380dc81b4bc82a0f6e"} Mar 18 16:53:34.918419 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:34.918397 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-g8c44" Mar 18 16:53:35.034629 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.034598 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzf8\" (UniqueName: \"kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8\") pod \"882a0a1f-b512-4373-8265-28e77cc3c62a\" (UID: \"882a0a1f-b512-4373-8265-28e77cc3c62a\") " Mar 18 16:53:35.036797 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.036774 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8" (OuterVolumeSpecName: "kube-api-access-nkzf8") pod "882a0a1f-b512-4373-8265-28e77cc3c62a" (UID: "882a0a1f-b512-4373-8265-28e77cc3c62a"). InnerVolumeSpecName "kube-api-access-nkzf8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:53:35.135177 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.135104 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkzf8\" (UniqueName: \"kubernetes.io/projected/882a0a1f-b512-4373-8265-28e77cc3c62a-kube-api-access-nkzf8\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:53:35.803790 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.803758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-g8c44" event={"ID":"882a0a1f-b512-4373-8265-28e77cc3c62a","Type":"ContainerDied","Data":"f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb"} Mar 18 16:53:35.803790 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.803789 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9c724db9f0b0fa79682b1f3bcb1de44f612036bea85a1371525a427bd53c7cb" Mar 18 16:53:35.804025 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:35.803800 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-g8c44" Mar 18 16:53:44.051986 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.051938 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:53:44.052356 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.052282 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="882a0a1f-b512-4373-8265-28e77cc3c62a" containerName="s3-init" Mar 18 16:53:44.052356 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.052294 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="882a0a1f-b512-4373-8265-28e77cc3c62a" containerName="s3-init" Mar 18 16:53:44.052356 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.052346 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="882a0a1f-b512-4373-8265-28e77cc3c62a" containerName="s3-init" Mar 18 16:53:44.054829 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.054811 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:44.056953 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.056935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Mar 18 16:53:44.057081 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.056955 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:53:44.057081 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.056966 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lns69\"" Mar 18 16:53:44.064604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.064578 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:53:44.215146 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.215112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbg4d\" (UniqueName: \"kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d\") pod \"error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn\" (UID: \"ebffde87-c362-47f0-b8dd-9798b092a274\") " pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:44.265255 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.265222 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:53:44.268204 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.268190 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.276427 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.276401 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:53:44.316511 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.316435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbg4d\" (UniqueName: \"kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d\") pod \"error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn\" (UID: \"ebffde87-c362-47f0-b8dd-9798b092a274\") " pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:44.323853 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.323826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbg4d\" (UniqueName: \"kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d\") pod \"error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn\" (UID: \"ebffde87-c362-47f0-b8dd-9798b092a274\") " pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:44.365885 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.365859 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:44.417000 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.416951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4s77\" (UniqueName: \"kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.417203 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.417174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.481344 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.481294 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:53:44.483702 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:53:44.483670 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebffde87_c362_47f0_b8dd_9798b092a274.slice/crio-0342f8b55d02fc60c6f5720b4cb2447fd9929354f372f87c47fe6da9746d8aa6 WatchSource:0}: Error finding container 0342f8b55d02fc60c6f5720b4cb2447fd9929354f372f87c47fe6da9746d8aa6: Status 404 returned error can't find the container with id 0342f8b55d02fc60c6f5720b4cb2447fd9929354f372f87c47fe6da9746d8aa6 Mar 18 16:53:44.517735 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.517698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-z4s77\" (UniqueName: \"kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.517883 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.517791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.518219 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.518195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.525534 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.525512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4s77\" (UniqueName: \"kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77\") pod \"isvc-xgboost-graph-predictor-5c67486796-rk6k8\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.580110 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.580039 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:53:44.707753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.707654 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:53:44.710042 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:53:44.710002 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44daecb7_0fa2_4687_9d98_422f0cc3a248.slice/crio-d53454130c76ffc4311637d93dcfce5cb1464ce928f02ec1d3a1c40f486fe8a2 WatchSource:0}: Error finding container d53454130c76ffc4311637d93dcfce5cb1464ce928f02ec1d3a1c40f486fe8a2: Status 404 returned error can't find the container with id d53454130c76ffc4311637d93dcfce5cb1464ce928f02ec1d3a1c40f486fe8a2 Mar 18 16:53:44.832806 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.832719 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerStarted","Data":"d53454130c76ffc4311637d93dcfce5cb1464ce928f02ec1d3a1c40f486fe8a2"} Mar 18 16:53:44.833784 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:44.833762 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" event={"ID":"ebffde87-c362-47f0-b8dd-9798b092a274","Type":"ContainerStarted","Data":"0342f8b55d02fc60c6f5720b4cb2447fd9929354f372f87c47fe6da9746d8aa6"} Mar 18 16:53:51.859677 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:51.859642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerStarted","Data":"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97"} Mar 18 16:53:51.861016 ip-10-0-139-43 kubenswrapper[2571]: 
I0318 16:53:51.860990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" event={"ID":"ebffde87-c362-47f0-b8dd-9798b092a274","Type":"ContainerStarted","Data":"c184554547f5fa477d982a464f5609cda6db7f9c37ec972e6580623863c34629"} Mar 18 16:53:51.861225 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:51.861201 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:53:51.862582 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:51.862536 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:53:51.889138 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:51.889093 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podStartSLOduration=1.452638065 podStartE2EDuration="7.88907949s" podCreationTimestamp="2026-03-18 16:53:44 +0000 UTC" firstStartedPulling="2026-03-18 16:53:44.485371712 +0000 UTC m=+565.015244871" lastFinishedPulling="2026-03-18 16:53:50.921813134 +0000 UTC m=+571.451686296" observedRunningTime="2026-03-18 16:53:51.887374342 +0000 UTC m=+572.417247522" watchObservedRunningTime="2026-03-18 16:53:51.88907949 +0000 UTC m=+572.418952671" Mar 18 16:53:52.864560 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:52.864524 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:53:54.871732 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:53:54.871702 2571 generic.go:358] "Generic (PLEG): container finished" podID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerID="43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97" exitCode=0 Mar 18 16:53:54.872099 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:53:54.871783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerDied","Data":"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97"} Mar 18 16:54:02.865485 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:02.865438 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:54:12.864870 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:12.864822 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:54:12.936408 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:12.936375 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerStarted","Data":"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27"} Mar 18 16:54:12.936735 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:12.936710 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:54:12.937905 ip-10-0-139-43 kubenswrapper[2571]: I0318 
16:54:12.937881 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:54:12.952394 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:12.952349 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podStartSLOduration=0.89744087 podStartE2EDuration="28.9523363s" podCreationTimestamp="2026-03-18 16:53:44 +0000 UTC" firstStartedPulling="2026-03-18 16:53:44.712058868 +0000 UTC m=+565.241932027" lastFinishedPulling="2026-03-18 16:54:12.766954299 +0000 UTC m=+593.296827457" observedRunningTime="2026-03-18 16:54:12.951179261 +0000 UTC m=+593.481052442" watchObservedRunningTime="2026-03-18 16:54:12.9523363 +0000 UTC m=+593.482209480" Mar 18 16:54:13.939604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:13.939564 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:54:20.021529 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.021496 2571 scope.go:117] "RemoveContainer" containerID="e61bbeae6e6a5d174029f9f31bb096d25f67e518d521142814ad5f11b627f944" Mar 18 16:54:20.029152 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.029134 2571 scope.go:117] "RemoveContainer" containerID="55d14751ad08d7ee9e7970bf1e0780716b60441cd21392c20d7794f332d4fb85" Mar 18 16:54:20.036097 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.036030 2571 scope.go:117] "RemoveContainer" containerID="ddd17028fe4e8136c8a3913e4ee100ad628f389b29a5b9307a76aa03bfdab6fb" Mar 18 16:54:20.043506 ip-10-0-139-43 kubenswrapper[2571]: 
I0318 16:54:20.043490 2571 scope.go:117] "RemoveContainer" containerID="af54614e2cf5afb973074a22c1b60c9ca499c8cea68e7a8630a3d418343bf2d5" Mar 18 16:54:20.050536 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.050441 2571 scope.go:117] "RemoveContainer" containerID="b2e20a6a7d766fe42c81e130b003b79a894fbdbf436d5b34661ef9548e1b13cd" Mar 18 16:54:20.057604 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.057588 2571 scope.go:117] "RemoveContainer" containerID="58303a8e94d369291b38047d9dbd413671ee10a9a63f32606f0a87f9c8382b42" Mar 18 16:54:20.064396 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:20.064380 2571 scope.go:117] "RemoveContainer" containerID="c92ffb079258ebf23ae7ea4114b40b66479ed2a68842715de5472fa766960c88" Mar 18 16:54:22.864747 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:22.864699 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:54:23.940330 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:23.940288 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:54:32.865008 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:32.864905 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:54:33.940468 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:33.940423 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:54:42.865387 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:42.865342 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Mar 18 16:54:43.939853 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:43.939811 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:54:52.866118 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:52.866084 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:54:53.940537 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:54:53.940488 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:55:03.940179 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:03.940136 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Mar 18 16:55:13.941416 ip-10-0-139-43 
kubenswrapper[2571]: I0318 16:55:13.941382 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:55:24.250195 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.250158 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"] Mar 18 16:55:24.253415 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.253392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:24.263597 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.263572 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"] Mar 18 16:55:24.273164 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.273139 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:55:24.273430 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.273392 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" containerID="cri-o://c184554547f5fa477d982a464f5609cda6db7f9c37ec972e6580623863c34629" gracePeriod=30 Mar 18 16:55:24.275441 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.275413 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pqp\" (UniqueName: \"kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp\") pod \"error-404-isvc-dfe46-predictor-d698cb6db-b6k9k\" (UID: \"68676a2c-17bc-4b64-81d4-d23de1c32dd1\") " pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:24.376313 ip-10-0-139-43 kubenswrapper[2571]: 
I0318 16:55:24.376276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pqp\" (UniqueName: \"kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp\") pod \"error-404-isvc-dfe46-predictor-d698cb6db-b6k9k\" (UID: \"68676a2c-17bc-4b64-81d4-d23de1c32dd1\") " pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:24.383738 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.383711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pqp\" (UniqueName: \"kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp\") pod \"error-404-isvc-dfe46-predictor-d698cb6db-b6k9k\" (UID: \"68676a2c-17bc-4b64-81d4-d23de1c32dd1\") " pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:24.566535 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.566454 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:24.679643 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:24.679616 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"] Mar 18 16:55:24.682153 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:55:24.682125 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68676a2c_17bc_4b64_81d4_d23de1c32dd1.slice/crio-6fb8925be0e14e6be51b2725c6cf65d52978af38f3e0791b5078bd7911442e52 WatchSource:0}: Error finding container 6fb8925be0e14e6be51b2725c6cf65d52978af38f3e0791b5078bd7911442e52: Status 404 returned error can't find the container with id 6fb8925be0e14e6be51b2725c6cf65d52978af38f3e0791b5078bd7911442e52 Mar 18 16:55:25.148175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:25.148135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" event={"ID":"68676a2c-17bc-4b64-81d4-d23de1c32dd1","Type":"ContainerStarted","Data":"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"} Mar 18 16:55:25.148175 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:25.148176 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" event={"ID":"68676a2c-17bc-4b64-81d4-d23de1c32dd1","Type":"ContainerStarted","Data":"6fb8925be0e14e6be51b2725c6cf65d52978af38f3e0791b5078bd7911442e52"} Mar 18 16:55:25.148383 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:25.148350 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:55:25.149753 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:25.149725 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:55:25.163359 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:25.163319 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podStartSLOduration=1.163306613 podStartE2EDuration="1.163306613s" podCreationTimestamp="2026-03-18 16:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:55:25.161153276 +0000 UTC m=+665.691026457" watchObservedRunningTime="2026-03-18 16:55:25.163306613 +0000 UTC m=+665.693179793" Mar 18 16:55:26.151432 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:26.151396 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:55:31.165842 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.165810 2571 generic.go:358] "Generic (PLEG): container finished" podID="ebffde87-c362-47f0-b8dd-9798b092a274" containerID="c184554547f5fa477d982a464f5609cda6db7f9c37ec972e6580623863c34629" exitCode=0 Mar 18 16:55:31.166201 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.165877 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" event={"ID":"ebffde87-c362-47f0-b8dd-9798b092a274","Type":"ContainerDied","Data":"c184554547f5fa477d982a464f5609cda6db7f9c37ec972e6580623863c34629"} Mar 18 16:55:31.212923 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.212900 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:55:31.232550 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.232521 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbg4d\" (UniqueName: \"kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d\") pod \"ebffde87-c362-47f0-b8dd-9798b092a274\" (UID: \"ebffde87-c362-47f0-b8dd-9798b092a274\") " Mar 18 16:55:31.234646 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.234619 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d" (OuterVolumeSpecName: "kube-api-access-sbg4d") pod "ebffde87-c362-47f0-b8dd-9798b092a274" (UID: "ebffde87-c362-47f0-b8dd-9798b092a274"). InnerVolumeSpecName "kube-api-access-sbg4d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:55:31.338800 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:31.334690 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbg4d\" (UniqueName: \"kubernetes.io/projected/ebffde87-c362-47f0-b8dd-9798b092a274-kube-api-access-sbg4d\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:55:32.170191 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:32.170153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" event={"ID":"ebffde87-c362-47f0-b8dd-9798b092a274","Type":"ContainerDied","Data":"0342f8b55d02fc60c6f5720b4cb2447fd9929354f372f87c47fe6da9746d8aa6"} Mar 18 16:55:32.170191 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:32.170180 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn" Mar 18 16:55:32.170653 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:32.170211 2571 scope.go:117] "RemoveContainer" containerID="c184554547f5fa477d982a464f5609cda6db7f9c37ec972e6580623863c34629" Mar 18 16:55:32.184631 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:32.184600 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:55:32.186108 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:32.186085 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a225b-predictor-6f4b648d5b-kq7qn"] Mar 18 16:55:34.124982 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:34.124949 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" path="/var/lib/kubelet/pods/ebffde87-c362-47f0-b8dd-9798b092a274/volumes" Mar 18 16:55:36.152159 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:36.152125 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:55:46.152163 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:46.152122 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:55:56.151801 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:55:56.151755 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:56:03.970053 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:03.969949 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:56:03.970514 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:03.970337 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container" containerID="cri-o://b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27" gracePeriod=30 Mar 18 16:56:04.370126 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.370043 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"] Mar 18 16:56:04.370363 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.370351 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" Mar 18 16:56:04.370409 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.370364 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" Mar 18 16:56:04.370444 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.370431 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebffde87-c362-47f0-b8dd-9798b092a274" containerName="kserve-container" Mar 18 16:56:04.372535 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.372519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:04.381124 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.381097 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"] Mar 18 16:56:04.521117 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.521075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwk7g\" (UniqueName: \"kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g\") pod \"error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp\" (UID: \"42e0ab27-c561-4c38-b5be-e004773cde91\") " pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:04.622139 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.622044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwk7g\" (UniqueName: \"kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g\") pod \"error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp\" (UID: \"42e0ab27-c561-4c38-b5be-e004773cde91\") " pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:04.629964 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.629935 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwk7g\" (UniqueName: \"kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g\") pod \"error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp\" (UID: \"42e0ab27-c561-4c38-b5be-e004773cde91\") " pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:04.683294 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.683257 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:04.802687 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.802660 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"] Mar 18 16:56:04.804932 ip-10-0-139-43 kubenswrapper[2571]: W0318 16:56:04.804906 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e0ab27_c561_4c38_b5be_e004773cde91.slice/crio-23ccf4d4e10e4f3500d1499cd54c85cdff66b70df8debf0bb55e79372bc3c478 WatchSource:0}: Error finding container 23ccf4d4e10e4f3500d1499cd54c85cdff66b70df8debf0bb55e79372bc3c478: Status 404 returned error can't find the container with id 23ccf4d4e10e4f3500d1499cd54c85cdff66b70df8debf0bb55e79372bc3c478 Mar 18 16:56:04.806733 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:04.806716 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:56:05.267761 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:05.267675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" event={"ID":"42e0ab27-c561-4c38-b5be-e004773cde91","Type":"ContainerStarted","Data":"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"} Mar 18 16:56:05.267761 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:05.267713 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" event={"ID":"42e0ab27-c561-4c38-b5be-e004773cde91","Type":"ContainerStarted","Data":"23ccf4d4e10e4f3500d1499cd54c85cdff66b70df8debf0bb55e79372bc3c478"} Mar 18 16:56:05.267761 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:05.267733 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" Mar 18 16:56:05.269263 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:05.269234 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Mar 18 16:56:05.283023 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:05.282968 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podStartSLOduration=1.282954159 podStartE2EDuration="1.282954159s" podCreationTimestamp="2026-03-18 16:56:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:05.281300385 +0000 UTC m=+705.811173565" watchObservedRunningTime="2026-03-18 16:56:05.282954159 +0000 UTC m=+705.812827343" Mar 18 16:56:06.151714 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:06.151669 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:56:06.271155 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:06.271114 2571 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Mar 18 16:56:07.814249 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:07.814225 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:56:07.954924 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:07.954890 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4s77\" (UniqueName: \"kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77\") pod \"44daecb7-0fa2-4687-9d98-422f0cc3a248\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " Mar 18 16:56:07.955116 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:07.954954 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location\") pod \"44daecb7-0fa2-4687-9d98-422f0cc3a248\" (UID: \"44daecb7-0fa2-4687-9d98-422f0cc3a248\") " Mar 18 16:56:07.955285 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:07.955255 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "44daecb7-0fa2-4687-9d98-422f0cc3a248" (UID: "44daecb7-0fa2-4687-9d98-422f0cc3a248"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:56:07.956959 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:07.956938 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77" (OuterVolumeSpecName: "kube-api-access-z4s77") pod "44daecb7-0fa2-4687-9d98-422f0cc3a248" (UID: "44daecb7-0fa2-4687-9d98-422f0cc3a248"). InnerVolumeSpecName "kube-api-access-z4s77". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:56:08.055693 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.055651 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4s77\" (UniqueName: \"kubernetes.io/projected/44daecb7-0fa2-4687-9d98-422f0cc3a248-kube-api-access-z4s77\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:56:08.055693 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.055683 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44daecb7-0fa2-4687-9d98-422f0cc3a248-kserve-provision-location\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 16:56:08.277567 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.277474 2571 generic.go:358] "Generic (PLEG): container finished" podID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerID="b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27" exitCode=0 Mar 18 16:56:08.277704 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.277562 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerDied","Data":"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27"} Mar 18 16:56:08.277704 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.277601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" event={"ID":"44daecb7-0fa2-4687-9d98-422f0cc3a248","Type":"ContainerDied","Data":"d53454130c76ffc4311637d93dcfce5cb1464ce928f02ec1d3a1c40f486fe8a2"} Mar 18 16:56:08.277704 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.277616 2571 scope.go:117] "RemoveContainer" containerID="b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27" Mar 18 16:56:08.277704 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.277571 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8" Mar 18 16:56:08.285902 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.285883 2571 scope.go:117] "RemoveContainer" containerID="43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97" Mar 18 16:56:08.294815 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.294787 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:56:08.296905 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.296331 2571 scope.go:117] "RemoveContainer" containerID="b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27" Mar 18 16:56:08.296905 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:56:08.296612 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27\": container with ID starting with b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27 not found: ID does not exist" containerID="b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27" Mar 18 16:56:08.296905 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.296651 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27"} err="failed to 
get container status \"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27\": rpc error: code = NotFound desc = could not find container \"b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27\": container with ID starting with b7d0e0f5335ed88de8470ee8a70daaf9cfd03f0aa79ccdeb6124e49ce323fe27 not found: ID does not exist" Mar 18 16:56:08.296905 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.296677 2571 scope.go:117] "RemoveContainer" containerID="43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97" Mar 18 16:56:08.297172 ip-10-0-139-43 kubenswrapper[2571]: E0318 16:56:08.296961 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97\": container with ID starting with 43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97 not found: ID does not exist" containerID="43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97" Mar 18 16:56:08.297172 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.297010 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97"} err="failed to get container status \"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97\": rpc error: code = NotFound desc = could not find container \"43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97\": container with ID starting with 43433e3c7d4649118a2b020b76ac3fed22ad66f1f57d54e042c94e0730913d97 not found: ID does not exist" Mar 18 16:56:08.297819 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:08.297797 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-5c67486796-rk6k8"] Mar 18 16:56:10.125844 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:10.125809 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" path="/var/lib/kubelet/pods/44daecb7-0fa2-4687-9d98-422f0cc3a248/volumes" Mar 18 16:56:16.152001 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:16.151935 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Mar 18 16:56:16.272225 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:16.272182 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Mar 18 16:56:26.153191 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:26.153162 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" Mar 18 16:56:26.271717 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:26.271674 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Mar 18 16:56:36.271455 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:36.271411 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Mar 18 16:56:46.272130 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:46.272083 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Mar 18 16:56:56.271533 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:56:56.271486 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Mar 18 16:57:06.272142 ip-10-0-139-43 kubenswrapper[2571]: I0318 16:57:06.272108 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"
Mar 18 17:04:59.447766 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.447676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"]
Mar 18 17:04:59.450081 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.448024 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="storage-initializer"
Mar 18 17:04:59.450081 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.448037 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="storage-initializer"
Mar 18 17:04:59.450081 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.448047 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container"
Mar 18 17:04:59.450081 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.448052 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container"
Mar 18 17:04:59.450081 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.448101 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="44daecb7-0fa2-4687-9d98-422f0cc3a248" containerName="kserve-container"
Mar 18 17:04:59.450791 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.450775 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:04:59.461210 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.461178 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"]
Mar 18 17:04:59.477264 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.477229 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"]
Mar 18 17:04:59.477535 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.477510 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container" containerID="cri-o://bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08" gracePeriod=30
Mar 18 17:04:59.554358 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.554318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnljc\" (UniqueName: \"kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc\") pod \"error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8\" (UID: \"57cebae5-c6d8-48fe-896f-6bf4e093c294\") " pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:04:59.654963 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.654919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnljc\" (UniqueName: \"kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc\") pod \"error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8\" (UID: \"57cebae5-c6d8-48fe-896f-6bf4e093c294\") " pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:04:59.662646 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.662612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnljc\" (UniqueName: \"kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc\") pod \"error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8\" (UID: \"57cebae5-c6d8-48fe-896f-6bf4e093c294\") " pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:04:59.763500 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.763409 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:04:59.884445 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.884380 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"]
Mar 18 17:04:59.887319 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:04:59.887286 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57cebae5_c6d8_48fe_896f_6bf4e093c294.slice/crio-c0becb4ec6214dc2000d78311f4a57fd44267124e65f4d3af4c1561388c5089c WatchSource:0}: Error finding container c0becb4ec6214dc2000d78311f4a57fd44267124e65f4d3af4c1561388c5089c: Status 404 returned error can't find the container with id c0becb4ec6214dc2000d78311f4a57fd44267124e65f4d3af4c1561388c5089c
Mar 18 17:04:59.889063 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:04:59.889047 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:05:00.857062 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:00.857029 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" event={"ID":"57cebae5-c6d8-48fe-896f-6bf4e093c294","Type":"ContainerStarted","Data":"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"}
Mar 18 17:05:00.857062 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:00.857068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" event={"ID":"57cebae5-c6d8-48fe-896f-6bf4e093c294","Type":"ContainerStarted","Data":"c0becb4ec6214dc2000d78311f4a57fd44267124e65f4d3af4c1561388c5089c"}
Mar 18 17:05:00.857545 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:00.857203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:05:00.858656 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:00.858627 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:00.872661 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:00.872613 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podStartSLOduration=1.872600724 podStartE2EDuration="1.872600724s" podCreationTimestamp="2026-03-18 17:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:00.871427962 +0000 UTC m=+1241.401301143" watchObservedRunningTime="2026-03-18 17:05:00.872600724 +0000 UTC m=+1241.402473962"
Mar 18 17:05:01.860303 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:01.860241 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:02.820249 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.820224 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"
Mar 18 17:05:02.863770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.863681 2571 generic.go:358] "Generic (PLEG): container finished" podID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerID="bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08" exitCode=0
Mar 18 17:05:02.863770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.863738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" event={"ID":"68676a2c-17bc-4b64-81d4-d23de1c32dd1","Type":"ContainerDied","Data":"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"}
Mar 18 17:05:02.863770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.863744 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"
Mar 18 17:05:02.864281 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.863772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k" event={"ID":"68676a2c-17bc-4b64-81d4-d23de1c32dd1","Type":"ContainerDied","Data":"6fb8925be0e14e6be51b2725c6cf65d52978af38f3e0791b5078bd7911442e52"}
Mar 18 17:05:02.864281 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.863791 2571 scope.go:117] "RemoveContainer" containerID="bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"
Mar 18 17:05:02.871825 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.871806 2571 scope.go:117] "RemoveContainer" containerID="bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"
Mar 18 17:05:02.872083 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:05:02.872063 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08\": container with ID starting with bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08 not found: ID does not exist" containerID="bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"
Mar 18 17:05:02.872157 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.872095 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08"} err="failed to get container status \"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08\": rpc error: code = NotFound desc = could not find container \"bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08\": container with ID starting with bcd34311ccbf68dabb16d480f489e0e527945a124a2d7309ac49c95388bdaf08 not found: ID does not exist"
Mar 18 17:05:02.882866 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.882849 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8pqp\" (UniqueName: \"kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp\") pod \"68676a2c-17bc-4b64-81d4-d23de1c32dd1\" (UID: \"68676a2c-17bc-4b64-81d4-d23de1c32dd1\") "
Mar 18 17:05:02.884864 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.884835 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp" (OuterVolumeSpecName: "kube-api-access-l8pqp") pod "68676a2c-17bc-4b64-81d4-d23de1c32dd1" (UID: "68676a2c-17bc-4b64-81d4-d23de1c32dd1"). InnerVolumeSpecName "kube-api-access-l8pqp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:05:02.983704 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:02.983644 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8pqp\" (UniqueName: \"kubernetes.io/projected/68676a2c-17bc-4b64-81d4-d23de1c32dd1-kube-api-access-l8pqp\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:05:03.186166 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:03.186126 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"]
Mar 18 17:05:03.187459 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:03.187428 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-dfe46-predictor-d698cb6db-b6k9k"]
Mar 18 17:05:04.126039 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:04.126007 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" path="/var/lib/kubelet/pods/68676a2c-17bc-4b64-81d4-d23de1c32dd1/volumes"
Mar 18 17:05:11.861075 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:11.861027 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:21.860440 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:21.860394 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:31.861147 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:31.861099 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:39.117742 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.117704 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"]
Mar 18 17:05:39.118605 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.118569 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container" containerID="cri-o://46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc" gracePeriod=30
Mar 18 17:05:39.153102 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.153067 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"]
Mar 18 17:05:39.153602 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.153587 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container"
Mar 18 17:05:39.153649 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.153607 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container"
Mar 18 17:05:39.153732 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.153720 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="68676a2c-17bc-4b64-81d4-d23de1c32dd1" containerName="kserve-container"
Mar 18 17:05:39.156807 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.156784 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.161907 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.161879 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"]
Mar 18 17:05:39.192963 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.192923 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5v6\" (UniqueName: \"kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6\") pod \"error-404-isvc-18dc8-predictor-646c887d76-n8862\" (UID: \"a5935591-7466-4118-ad77-a4f0fb921c33\") " pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.294290 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.294257 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5v6\" (UniqueName: \"kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6\") pod \"error-404-isvc-18dc8-predictor-646c887d76-n8862\" (UID: \"a5935591-7466-4118-ad77-a4f0fb921c33\") " pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.302666 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.302636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5v6\" (UniqueName: \"kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6\") pod \"error-404-isvc-18dc8-predictor-646c887d76-n8862\" (UID: \"a5935591-7466-4118-ad77-a4f0fb921c33\") " pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.469150 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.469083 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.588061 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.588024 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"]
Mar 18 17:05:39.591642 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:05:39.591616 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5935591_7466_4118_ad77_a4f0fb921c33.slice/crio-1a9afb766f2e4ac35d6e3330c5cac003dff2e253a4afe7ad970181e50eee6b8f WatchSource:0}: Error finding container 1a9afb766f2e4ac35d6e3330c5cac003dff2e253a4afe7ad970181e50eee6b8f: Status 404 returned error can't find the container with id 1a9afb766f2e4ac35d6e3330c5cac003dff2e253a4afe7ad970181e50eee6b8f
Mar 18 17:05:39.979018 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.978953 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" event={"ID":"a5935591-7466-4118-ad77-a4f0fb921c33","Type":"ContainerStarted","Data":"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca"}
Mar 18 17:05:39.979018 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.979022 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"
Mar 18 17:05:39.979460 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.979034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" event={"ID":"a5935591-7466-4118-ad77-a4f0fb921c33","Type":"ContainerStarted","Data":"1a9afb766f2e4ac35d6e3330c5cac003dff2e253a4afe7ad970181e50eee6b8f"}
Mar 18 17:05:39.980461 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.980435 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:05:39.995052 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:39.995005 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podStartSLOduration=0.994967695 podStartE2EDuration="994.967695ms" podCreationTimestamp="2026-03-18 17:05:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:05:39.993795442 +0000 UTC m=+1280.523668624" watchObservedRunningTime="2026-03-18 17:05:39.994967695 +0000 UTC m=+1280.524840875"
Mar 18 17:05:40.982071 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:40.982030 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:05:41.860655 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:41.860610 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:05:42.368587 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.368564 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"
Mar 18 17:05:42.421587 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.421556 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwk7g\" (UniqueName: \"kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g\") pod \"42e0ab27-c561-4c38-b5be-e004773cde91\" (UID: \"42e0ab27-c561-4c38-b5be-e004773cde91\") "
Mar 18 17:05:42.423688 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.423653 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g" (OuterVolumeSpecName: "kube-api-access-rwk7g") pod "42e0ab27-c561-4c38-b5be-e004773cde91" (UID: "42e0ab27-c561-4c38-b5be-e004773cde91"). InnerVolumeSpecName "kube-api-access-rwk7g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:05:42.523004 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.522904 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rwk7g\" (UniqueName: \"kubernetes.io/projected/42e0ab27-c561-4c38-b5be-e004773cde91-kube-api-access-rwk7g\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:05:42.988559 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.988525 2571 generic.go:358] "Generic (PLEG): container finished" podID="42e0ab27-c561-4c38-b5be-e004773cde91" containerID="46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc" exitCode=0
Mar 18 17:05:42.988770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.988584 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"
Mar 18 17:05:42.988770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.988609 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" event={"ID":"42e0ab27-c561-4c38-b5be-e004773cde91","Type":"ContainerDied","Data":"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"}
Mar 18 17:05:42.988770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.988651 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp" event={"ID":"42e0ab27-c561-4c38-b5be-e004773cde91","Type":"ContainerDied","Data":"23ccf4d4e10e4f3500d1499cd54c85cdff66b70df8debf0bb55e79372bc3c478"}
Mar 18 17:05:42.988770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:42.988672 2571 scope.go:117] "RemoveContainer" containerID="46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"
Mar 18 17:05:43.001535 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:43.001514 2571 scope.go:117] "RemoveContainer" containerID="46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"
Mar 18 17:05:43.001859 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:05:43.001834 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc\": container with ID starting with 46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc not found: ID does not exist" containerID="46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"
Mar 18 17:05:43.001904 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:43.001875 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc"} err="failed to get container status \"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc\": rpc error: code = NotFound desc = could not find container \"46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc\": container with ID starting with 46eea1a984861a4027ecd52248f694d454a054c9a8c529df06f25482ff12f6bc not found: ID does not exist"
Mar 18 17:05:43.011174 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:43.011146 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"]
Mar 18 17:05:43.012836 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:43.012817 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9913d-predictor-5ccb69bd64-h6pgp"]
Mar 18 17:05:44.131519 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:44.131486 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" path="/var/lib/kubelet/pods/42e0ab27-c561-4c38-b5be-e004773cde91/volumes"
Mar 18 17:05:50.982323 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:50.982285 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:05:51.861138 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:05:51.861095 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:06:00.982885 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:00.982833 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:06:01.861180 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:01.861145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:06:10.982630 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:10.982589 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:06:20.982273 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:20.982222 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:06:29.492295 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.492213 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"]
Mar 18 17:06:29.492659 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.492547 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" containerID="cri-o://d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf" gracePeriod=30
Mar 18 17:06:29.751993 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.751898 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"]
Mar 18 17:06:29.752242 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.752230 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container"
Mar 18 17:06:29.752285 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.752244 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container"
Mar 18 17:06:29.752319 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.752309 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e0ab27-c561-4c38-b5be-e004773cde91" containerName="kserve-container"
Mar 18 17:06:29.756318 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.756300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:29.761725 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.761698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"]
Mar 18 17:06:29.798900 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.798851 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrbc\" (UniqueName: \"kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc\") pod \"error-404-isvc-19f83-predictor-6d757459d6-rjs8t\" (UID: \"f34256db-4192-491f-918c-db97261a9d3b\") " pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:29.899540 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.899507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrbc\" (UniqueName: \"kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc\") pod \"error-404-isvc-19f83-predictor-6d757459d6-rjs8t\" (UID: \"f34256db-4192-491f-918c-db97261a9d3b\") " pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:29.908118 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:29.908086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrbc\" (UniqueName: \"kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc\") pod \"error-404-isvc-19f83-predictor-6d757459d6-rjs8t\" (UID: \"f34256db-4192-491f-918c-db97261a9d3b\") " pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:30.067406 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:30.067320 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:30.185001 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:30.184958 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"]
Mar 18 17:06:30.187560 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:06:30.187531 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf34256db_4192_491f_918c_db97261a9d3b.slice/crio-65a47ead7843b86798ed50bd26de32769648b597ff7e4be607feb4eb98c90098 WatchSource:0}: Error finding container 65a47ead7843b86798ed50bd26de32769648b597ff7e4be607feb4eb98c90098: Status 404 returned error can't find the container with id 65a47ead7843b86798ed50bd26de32769648b597ff7e4be607feb4eb98c90098
Mar 18 17:06:30.982747 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:30.982707 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Mar 18 17:06:31.136634 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.136592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" event={"ID":"f34256db-4192-491f-918c-db97261a9d3b","Type":"ContainerStarted","Data":"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab"}
Mar 18 17:06:31.136634 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.136629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" event={"ID":"f34256db-4192-491f-918c-db97261a9d3b","Type":"ContainerStarted","Data":"65a47ead7843b86798ed50bd26de32769648b597ff7e4be607feb4eb98c90098"}
Mar 18 17:06:31.136907 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.136768 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"
Mar 18 17:06:31.137874 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.137847 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Mar 18 17:06:31.151337 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.151295 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podStartSLOduration=2.151280742 podStartE2EDuration="2.151280742s" podCreationTimestamp="2026-03-18 17:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:06:31.150672659 +0000 UTC m=+1331.680545852" watchObservedRunningTime="2026-03-18 17:06:31.151280742 +0000 UTC m=+1331.681153926"
Mar 18 17:06:31.860770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:31.860727 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Mar 18 17:06:32.139984 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:32.139890 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Mar 18 17:06:32.829735 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:32.829713 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:06:32.921659 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:32.921628 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnljc\" (UniqueName: \"kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc\") pod \"57cebae5-c6d8-48fe-896f-6bf4e093c294\" (UID: \"57cebae5-c6d8-48fe-896f-6bf4e093c294\") "
Mar 18 17:06:32.923606 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:32.923581 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc" (OuterVolumeSpecName: "kube-api-access-xnljc") pod "57cebae5-c6d8-48fe-896f-6bf4e093c294" (UID: "57cebae5-c6d8-48fe-896f-6bf4e093c294"). InnerVolumeSpecName "kube-api-access-xnljc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:06:33.022966 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.022935 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnljc\" (UniqueName: \"kubernetes.io/projected/57cebae5-c6d8-48fe-896f-6bf4e093c294-kube-api-access-xnljc\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:06:33.147595 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.147556 2571 generic.go:358] "Generic (PLEG): container finished" podID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerID="d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf" exitCode=0
Mar 18 17:06:33.148137 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.147616 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" event={"ID":"57cebae5-c6d8-48fe-896f-6bf4e093c294","Type":"ContainerDied","Data":"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"}
Mar 18 17:06:33.148137 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.147628 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"
Mar 18 17:06:33.148137 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.147648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8" event={"ID":"57cebae5-c6d8-48fe-896f-6bf4e093c294","Type":"ContainerDied","Data":"c0becb4ec6214dc2000d78311f4a57fd44267124e65f4d3af4c1561388c5089c"}
Mar 18 17:06:33.148137 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.147663 2571 scope.go:117] "RemoveContainer" containerID="d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"
Mar 18 17:06:33.156708 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.156686 2571 scope.go:117] "RemoveContainer" containerID="d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"
Mar 18 17:06:33.156942 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:06:33.156925 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf\": container with ID starting with d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf not found: ID does not exist" containerID="d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"
Mar 18 17:06:33.157033 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.156947 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf"} err="failed to get container status \"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf\": rpc error: code = NotFound desc = could not find container \"d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf\": container with ID starting with d326dff4b6616a8482cc5d0a2f738e014560eab7547699619c8057218322edaf not found: ID does not exist"
Mar 18 17:06:33.168048 ip-10-0-139-43
kubenswrapper[2571]: I0318 17:06:33.168024 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"] Mar 18 17:06:33.171200 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:33.171179 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-4ddda-predictor-6f79885fcd-tpnm8"] Mar 18 17:06:34.125593 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:34.125551 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" path="/var/lib/kubelet/pods/57cebae5-c6d8-48fe-896f-6bf4e093c294/volumes" Mar 18 17:06:40.984156 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:40.984124 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" Mar 18 17:06:42.140332 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:42.140289 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Mar 18 17:06:52.140359 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:06:52.140316 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Mar 18 17:07:02.140320 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:02.140270 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection 
refused" Mar 18 17:07:09.460202 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.460165 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:07:09.460713 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.460698 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" Mar 18 17:07:09.460761 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.460718 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" Mar 18 17:07:09.460807 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.460796 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="57cebae5-c6d8-48fe-896f-6bf4e093c294" containerName="kserve-container" Mar 18 17:07:09.463181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.463163 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:09.469016 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.468991 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:07:09.475856 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.475833 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"] Mar 18 17:07:09.476098 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.476078 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" containerID="cri-o://d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca" gracePeriod=30 Mar 18 17:07:09.541953 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.541910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnztt\" (UniqueName: \"kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt\") pod \"error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5\" (UID: \"0af702de-e71b-4aec-9584-4868398f7bc0\") " pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:09.642918 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.642881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnztt\" (UniqueName: \"kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt\") pod \"error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5\" (UID: \"0af702de-e71b-4aec-9584-4868398f7bc0\") " pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:09.650212 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.650187 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jnztt\" (UniqueName: \"kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt\") pod \"error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5\" (UID: \"0af702de-e71b-4aec-9584-4868398f7bc0\") " pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:09.776051 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.775940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:09.894523 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:09.894500 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:07:09.897085 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:07:09.897036 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af702de_e71b_4aec_9584_4868398f7bc0.slice/crio-42beb4f59e63a6386a4c2dbaadeaf81dfb24947f1cc9319366699454da6d1cfc WatchSource:0}: Error finding container 42beb4f59e63a6386a4c2dbaadeaf81dfb24947f1cc9319366699454da6d1cfc: Status 404 returned error can't find the container with id 42beb4f59e63a6386a4c2dbaadeaf81dfb24947f1cc9319366699454da6d1cfc Mar 18 17:07:10.260874 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.260836 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" event={"ID":"0af702de-e71b-4aec-9584-4868398f7bc0","Type":"ContainerStarted","Data":"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e"} Mar 18 17:07:10.260874 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.260878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" 
event={"ID":"0af702de-e71b-4aec-9584-4868398f7bc0","Type":"ContainerStarted","Data":"42beb4f59e63a6386a4c2dbaadeaf81dfb24947f1cc9319366699454da6d1cfc"} Mar 18 17:07:10.261178 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.260967 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:07:10.262428 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.262401 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:07:10.276391 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.276349 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podStartSLOduration=1.276336677 podStartE2EDuration="1.276336677s" podCreationTimestamp="2026-03-18 17:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:10.274388316 +0000 UTC m=+1370.804261496" watchObservedRunningTime="2026-03-18 17:07:10.276336677 +0000 UTC m=+1370.806209835" Mar 18 17:07:10.982991 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:10.982937 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused" Mar 18 17:07:11.264487 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:11.264399 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" 
podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:07:12.140885 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:12.140844 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Mar 18 17:07:12.616783 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:12.616758 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" Mar 18 17:07:12.768160 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:12.768071 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt5v6\" (UniqueName: \"kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6\") pod \"a5935591-7466-4118-ad77-a4f0fb921c33\" (UID: \"a5935591-7466-4118-ad77-a4f0fb921c33\") " Mar 18 17:07:12.770206 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:12.770171 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6" (OuterVolumeSpecName: "kube-api-access-kt5v6") pod "a5935591-7466-4118-ad77-a4f0fb921c33" (UID: "a5935591-7466-4118-ad77-a4f0fb921c33"). InnerVolumeSpecName "kube-api-access-kt5v6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:07:12.868860 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:12.868822 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kt5v6\" (UniqueName: \"kubernetes.io/projected/a5935591-7466-4118-ad77-a4f0fb921c33-kube-api-access-kt5v6\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 17:07:13.271166 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.271134 2571 generic.go:358] "Generic (PLEG): container finished" podID="a5935591-7466-4118-ad77-a4f0fb921c33" containerID="d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca" exitCode=0 Mar 18 17:07:13.271628 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.271201 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" Mar 18 17:07:13.271628 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.271219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" event={"ID":"a5935591-7466-4118-ad77-a4f0fb921c33","Type":"ContainerDied","Data":"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca"} Mar 18 17:07:13.271628 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.271256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862" event={"ID":"a5935591-7466-4118-ad77-a4f0fb921c33","Type":"ContainerDied","Data":"1a9afb766f2e4ac35d6e3330c5cac003dff2e253a4afe7ad970181e50eee6b8f"} Mar 18 17:07:13.271628 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.271272 2571 scope.go:117] "RemoveContainer" containerID="d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca" Mar 18 17:07:13.280020 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.279996 2571 scope.go:117] "RemoveContainer" containerID="d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca" 
Mar 18 17:07:13.280295 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:07:13.280276 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca\": container with ID starting with d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca not found: ID does not exist" containerID="d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca" Mar 18 17:07:13.280377 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.280310 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca"} err="failed to get container status \"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca\": rpc error: code = NotFound desc = could not find container \"d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca\": container with ID starting with d58116c70f3d9d7386e8cc37ec3e369358c08bf1144fe343161967e135576dca not found: ID does not exist" Mar 18 17:07:13.290896 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.290871 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"] Mar 18 17:07:13.297269 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:13.297247 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-18dc8-predictor-646c887d76-n8862"] Mar 18 17:07:14.127826 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:14.127791 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" path="/var/lib/kubelet/pods/a5935591-7466-4118-ad77-a4f0fb921c33/volumes" Mar 18 17:07:21.264556 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:21.264515 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" 
podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:07:22.140015 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:22.139966 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Mar 18 17:07:31.265227 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:31.265182 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:07:32.141316 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:32.141290 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" Mar 18 17:07:41.264493 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:41.264448 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:07:51.265009 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:07:51.264937 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:08:01.264832 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:08:01.264743 2571 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 17:08:11.266187 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:08:11.266151 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:16:04.777320 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.777220 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"] Mar 18 17:16:04.777867 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.777489 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" containerID="cri-o://42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab" gracePeriod=30 Mar 18 17:16:04.855476 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.855446 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:16:04.855781 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.855770 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" Mar 18 17:16:04.855829 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.855783 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" containerName="kserve-container" Mar 18 17:16:04.855874 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.855841 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5935591-7466-4118-ad77-a4f0fb921c33" 
containerName="kserve-container" Mar 18 17:16:04.858653 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.858634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:04.867501 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.867474 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:16:04.908347 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:04.908290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlc6f\" (UniqueName: \"kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f\") pod \"error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw\" (UID: \"bcc10674-da0b-4889-8b6f-5f40931c86e3\") " pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:05.009047 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.008982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlc6f\" (UniqueName: \"kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f\") pod \"error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw\" (UID: \"bcc10674-da0b-4889-8b6f-5f40931c86e3\") " pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:05.017477 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.017452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlc6f\" (UniqueName: \"kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f\") pod \"error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw\" (UID: \"bcc10674-da0b-4889-8b6f-5f40931c86e3\") " pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:05.169427 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.169394 2571 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:05.297302 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.297259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:16:05.301116 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:16:05.301085 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc10674_da0b_4889_8b6f_5f40931c86e3.slice/crio-8b7338891425bae0af675b753f11a451210200619748e51bc82257aa421b62a0 WatchSource:0}: Error finding container 8b7338891425bae0af675b753f11a451210200619748e51bc82257aa421b62a0: Status 404 returned error can't find the container with id 8b7338891425bae0af675b753f11a451210200619748e51bc82257aa421b62a0 Mar 18 17:16:05.303367 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.303348 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:16:05.850506 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.850469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" event={"ID":"bcc10674-da0b-4889-8b6f-5f40931c86e3","Type":"ContainerStarted","Data":"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e"} Mar 18 17:16:05.850506 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.850506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" event={"ID":"bcc10674-da0b-4889-8b6f-5f40931c86e3","Type":"ContainerStarted","Data":"8b7338891425bae0af675b753f11a451210200619748e51bc82257aa421b62a0"} Mar 18 17:16:05.851008 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.850683 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:16:05.851915 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.851887 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:05.864850 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:05.864799 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podStartSLOduration=1.8647822939999998 podStartE2EDuration="1.864782294s" podCreationTimestamp="2026-03-18 17:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:05.864078312 +0000 UTC m=+1906.393951493" watchObservedRunningTime="2026-03-18 17:16:05.864782294 +0000 UTC m=+1906.394655474" Mar 18 17:16:06.854218 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:06.854177 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:08.222718 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.222691 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" Mar 18 17:16:08.334963 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.334877 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrrbc\" (UniqueName: \"kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc\") pod \"f34256db-4192-491f-918c-db97261a9d3b\" (UID: \"f34256db-4192-491f-918c-db97261a9d3b\") " Mar 18 17:16:08.337002 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.336938 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc" (OuterVolumeSpecName: "kube-api-access-hrrbc") pod "f34256db-4192-491f-918c-db97261a9d3b" (UID: "f34256db-4192-491f-918c-db97261a9d3b"). InnerVolumeSpecName "kube-api-access-hrrbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:16:08.435420 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.435375 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrrbc\" (UniqueName: \"kubernetes.io/projected/f34256db-4192-491f-918c-db97261a9d3b-kube-api-access-hrrbc\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 17:16:08.864652 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.864615 2571 generic.go:358] "Generic (PLEG): container finished" podID="f34256db-4192-491f-918c-db97261a9d3b" containerID="42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab" exitCode=0 Mar 18 17:16:08.864834 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.864680 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" Mar 18 17:16:08.864834 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.864700 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" event={"ID":"f34256db-4192-491f-918c-db97261a9d3b","Type":"ContainerDied","Data":"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab"} Mar 18 17:16:08.864834 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.864741 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t" event={"ID":"f34256db-4192-491f-918c-db97261a9d3b","Type":"ContainerDied","Data":"65a47ead7843b86798ed50bd26de32769648b597ff7e4be607feb4eb98c90098"} Mar 18 17:16:08.864834 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.864758 2571 scope.go:117] "RemoveContainer" containerID="42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab" Mar 18 17:16:08.873924 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.873902 2571 scope.go:117] "RemoveContainer" containerID="42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab" Mar 18 17:16:08.874230 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:16:08.874212 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab\": container with ID starting with 42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab not found: ID does not exist" containerID="42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab" Mar 18 17:16:08.874306 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.874243 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab"} err="failed to get container status 
\"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab\": rpc error: code = NotFound desc = could not find container \"42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab\": container with ID starting with 42d495051ff2153cdf60f7a3da74fc743f429104e2439b8e8bb6930190e0ebab not found: ID does not exist" Mar 18 17:16:08.885035 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.885005 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"] Mar 18 17:16:08.886349 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:08.886328 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-19f83-predictor-6d757459d6-rjs8t"] Mar 18 17:16:10.125542 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:10.125505 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34256db-4192-491f-918c-db97261a9d3b" path="/var/lib/kubelet/pods/f34256db-4192-491f-918c-db97261a9d3b/volumes" Mar 18 17:16:16.854302 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:16.854254 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:26.854274 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:26.854220 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:36.854790 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:36.854741 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" 
podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:44.168855 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.168802 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:16:44.169360 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.169113 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" containerID="cri-o://258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e" gracePeriod=30 Mar 18 17:16:44.406597 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.406557 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"] Mar 18 17:16:44.406957 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.406941 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" Mar 18 17:16:44.407071 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.406960 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" Mar 18 17:16:44.407124 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.407099 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f34256db-4192-491f-918c-db97261a9d3b" containerName="kserve-container" Mar 18 17:16:44.409798 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.409777 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.415871 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.415846 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"] Mar 18 17:16:44.555040 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.554914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkjm\" (UniqueName: \"kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm\") pod \"error-404-isvc-eb922-predictor-6b476995dd-bh64r\" (UID: \"7c139f50-4575-43f9-b8ee-3b00be7c1b44\") " pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.656430 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.656393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkjm\" (UniqueName: \"kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm\") pod \"error-404-isvc-eb922-predictor-6b476995dd-bh64r\" (UID: \"7c139f50-4575-43f9-b8ee-3b00be7c1b44\") " pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.663550 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.663521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkjm\" (UniqueName: \"kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm\") pod \"error-404-isvc-eb922-predictor-6b476995dd-bh64r\" (UID: \"7c139f50-4575-43f9-b8ee-3b00be7c1b44\") " pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.720606 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.720575 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.840638 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.840590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"] Mar 18 17:16:44.842812 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:16:44.842776 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c139f50_4575_43f9_b8ee_3b00be7c1b44.slice/crio-918b4b5bb4b9926f9b63ecd08192dd6cb3affe6ea6fbc703a96170ea1354e592 WatchSource:0}: Error finding container 918b4b5bb4b9926f9b63ecd08192dd6cb3affe6ea6fbc703a96170ea1354e592: Status 404 returned error can't find the container with id 918b4b5bb4b9926f9b63ecd08192dd6cb3affe6ea6fbc703a96170ea1354e592 Mar 18 17:16:44.977967 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.977924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" event={"ID":"7c139f50-4575-43f9-b8ee-3b00be7c1b44","Type":"ContainerStarted","Data":"69001f036334db45d8d6a8ab50eafc15dbb15bebb7983346a372215674ba12bf"} Mar 18 17:16:44.977967 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.977961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" event={"ID":"7c139f50-4575-43f9-b8ee-3b00be7c1b44","Type":"ContainerStarted","Data":"918b4b5bb4b9926f9b63ecd08192dd6cb3affe6ea6fbc703a96170ea1354e592"} Mar 18 17:16:44.978240 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.978220 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:16:44.979454 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.979428 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:16:44.992332 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:44.992292 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podStartSLOduration=0.992280188 podStartE2EDuration="992.280188ms" podCreationTimestamp="2026-03-18 17:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:16:44.99157321 +0000 UTC m=+1945.521446392" watchObservedRunningTime="2026-03-18 17:16:44.992280188 +0000 UTC m=+1945.522153419" Mar 18 17:16:45.981430 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:45.981393 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:16:46.855203 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:46.855154 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:16:51.265180 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:51.265134 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Mar 18 
17:16:51.605132 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:51.605108 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:16:51.614623 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:51.614597 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnztt\" (UniqueName: \"kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt\") pod \"0af702de-e71b-4aec-9584-4868398f7bc0\" (UID: \"0af702de-e71b-4aec-9584-4868398f7bc0\") " Mar 18 17:16:51.616588 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:51.616563 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt" (OuterVolumeSpecName: "kube-api-access-jnztt") pod "0af702de-e71b-4aec-9584-4868398f7bc0" (UID: "0af702de-e71b-4aec-9584-4868398f7bc0"). InnerVolumeSpecName "kube-api-access-jnztt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:16:51.715263 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:51.715224 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jnztt\" (UniqueName: \"kubernetes.io/projected/0af702de-e71b-4aec-9584-4868398f7bc0-kube-api-access-jnztt\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 17:16:52.000172 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.000135 2571 generic.go:358] "Generic (PLEG): container finished" podID="0af702de-e71b-4aec-9584-4868398f7bc0" containerID="258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e" exitCode=0 Mar 18 17:16:52.000364 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.000219 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" Mar 18 17:16:52.000364 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.000222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" event={"ID":"0af702de-e71b-4aec-9584-4868398f7bc0","Type":"ContainerDied","Data":"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e"} Mar 18 17:16:52.000364 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.000324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5" event={"ID":"0af702de-e71b-4aec-9584-4868398f7bc0","Type":"ContainerDied","Data":"42beb4f59e63a6386a4c2dbaadeaf81dfb24947f1cc9319366699454da6d1cfc"} Mar 18 17:16:52.000364 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.000339 2571 scope.go:117] "RemoveContainer" containerID="258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e" Mar 18 17:16:52.009211 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.009192 2571 scope.go:117] "RemoveContainer" containerID="258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e" Mar 18 17:16:52.009497 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:16:52.009472 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e\": container with ID starting with 258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e not found: ID does not exist" containerID="258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e" Mar 18 17:16:52.009558 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.009510 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e"} err="failed to get container status 
\"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e\": rpc error: code = NotFound desc = could not find container \"258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e\": container with ID starting with 258c08383d96ccaf1473f51f6419755905df5bc78e4b8f2efe11bb4247af1f9e not found: ID does not exist" Mar 18 17:16:52.021853 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.021827 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:16:52.023122 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.023103 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-8d83a-predictor-6d6f8878fb-tbwp5"] Mar 18 17:16:52.125681 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:52.125647 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" path="/var/lib/kubelet/pods/0af702de-e71b-4aec-9584-4868398f7bc0/volumes" Mar 18 17:16:55.982548 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:55.982506 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:16:56.854329 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:16:56.854289 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:17:05.981728 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:05.981626 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" 
podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:17:06.856169 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:06.856142 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:17:15.981841 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:15.981795 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:17:25.982563 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:25.982510 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:17:35.160530 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.160496 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:17:35.161023 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.160814 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" containerID="cri-o://a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e" gracePeriod=30 Mar 18 17:17:35.346883 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.346843 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"] Mar 18 17:17:35.347268 
ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.347250 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" Mar 18 17:17:35.347355 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.347271 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" Mar 18 17:17:35.347406 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.347365 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0af702de-e71b-4aec-9584-4868398f7bc0" containerName="kserve-container" Mar 18 17:17:35.351808 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.351784 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:35.365136 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.365112 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"] Mar 18 17:17:35.484139 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.484052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2pv5\" (UniqueName: \"kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5\") pod \"error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs\" (UID: \"10c64850-8784-44ca-a6cd-26d488fa32fe\") " pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:35.584954 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.584911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2pv5\" (UniqueName: \"kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5\") pod \"error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs\" (UID: \"10c64850-8784-44ca-a6cd-26d488fa32fe\") " 
pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:35.593045 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.593011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2pv5\" (UniqueName: \"kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5\") pod \"error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs\" (UID: \"10c64850-8784-44ca-a6cd-26d488fa32fe\") " pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:35.661699 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.661657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:35.785021 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.784966 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"] Mar 18 17:17:35.788675 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:17:35.788645 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c64850_8784_44ca_a6cd_26d488fa32fe.slice/crio-cee80d2d4170a062d508c30701dc44524f678c73e598e94f1f045c3492e8d565 WatchSource:0}: Error finding container cee80d2d4170a062d508c30701dc44524f678c73e598e94f1f045c3492e8d565: Status 404 returned error can't find the container with id cee80d2d4170a062d508c30701dc44524f678c73e598e94f1f045c3492e8d565 Mar 18 17:17:35.981728 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:35.981679 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused" Mar 18 17:17:36.133453 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.133357 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" event={"ID":"10c64850-8784-44ca-a6cd-26d488fa32fe","Type":"ContainerStarted","Data":"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a"} Mar 18 17:17:36.133453 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.133393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" event={"ID":"10c64850-8784-44ca-a6cd-26d488fa32fe","Type":"ContainerStarted","Data":"cee80d2d4170a062d508c30701dc44524f678c73e598e94f1f045c3492e8d565"} Mar 18 17:17:36.133680 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.133546 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:17:36.134962 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.134938 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:17:36.148555 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.148328 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podStartSLOduration=1.148310584 podStartE2EDuration="1.148310584s" podCreationTimestamp="2026-03-18 17:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:17:36.147829524 +0000 UTC m=+1996.677702708" watchObservedRunningTime="2026-03-18 17:17:36.148310584 +0000 UTC m=+1996.678183765" Mar 18 17:17:36.854819 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:36.854777 2571 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Mar 18 17:17:37.136646 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:37.136559 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:17:38.395149 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:38.395124 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:17:38.509230 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:38.509149 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlc6f\" (UniqueName: \"kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f\") pod \"bcc10674-da0b-4889-8b6f-5f40931c86e3\" (UID: \"bcc10674-da0b-4889-8b6f-5f40931c86e3\") " Mar 18 17:17:38.511193 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:38.511164 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f" (OuterVolumeSpecName: "kube-api-access-rlc6f") pod "bcc10674-da0b-4889-8b6f-5f40931c86e3" (UID: "bcc10674-da0b-4889-8b6f-5f40931c86e3"). InnerVolumeSpecName "kube-api-access-rlc6f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:17:38.609751 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:38.609703 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlc6f\" (UniqueName: \"kubernetes.io/projected/bcc10674-da0b-4889-8b6f-5f40931c86e3-kube-api-access-rlc6f\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 17:17:39.141904 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.141866 2571 generic.go:358] "Generic (PLEG): container finished" podID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerID="a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e" exitCode=0 Mar 18 17:17:39.142113 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.141920 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" Mar 18 17:17:39.142113 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.142009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" event={"ID":"bcc10674-da0b-4889-8b6f-5f40931c86e3","Type":"ContainerDied","Data":"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e"} Mar 18 17:17:39.142113 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.142045 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw" event={"ID":"bcc10674-da0b-4889-8b6f-5f40931c86e3","Type":"ContainerDied","Data":"8b7338891425bae0af675b753f11a451210200619748e51bc82257aa421b62a0"} Mar 18 17:17:39.142113 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.142061 2571 scope.go:117] "RemoveContainer" containerID="a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e" Mar 18 17:17:39.152247 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.152231 2571 scope.go:117] "RemoveContainer" containerID="a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e" 
Mar 18 17:17:39.152498 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:17:39.152479 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e\": container with ID starting with a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e not found: ID does not exist" containerID="a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e" Mar 18 17:17:39.152543 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.152508 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e"} err="failed to get container status \"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e\": rpc error: code = NotFound desc = could not find container \"a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e\": container with ID starting with a09794bdc15c28c0f129d88f182f82670f8850b3fe952b872f3fd6c04d836f8e not found: ID does not exist" Mar 18 17:17:39.163212 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.163186 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:17:39.164850 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:39.164830 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-0c4f8-predictor-5fc4bc99fd-spxdw"] Mar 18 17:17:40.126344 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:40.126308 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" path="/var/lib/kubelet/pods/bcc10674-da0b-4889-8b6f-5f40931c86e3/volumes" Mar 18 17:17:45.983184 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:45.983148 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" Mar 18 17:17:47.137339 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:47.137291 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:17:57.136644 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:17:57.136599 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:18:07.136917 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:18:07.136868 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:18:17.136812 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:18:17.136770 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:18:27.137558 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:18:27.137509 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Mar 18 17:18:37.138105 ip-10-0-139-43 
kubenswrapper[2571]: I0318 17:18:37.138021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:27:10.086668 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:10.086636 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"] Mar 18 17:27:10.088943 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:10.086876 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container" containerID="cri-o://374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a" gracePeriod=30 Mar 18 17:27:13.423611 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.423586 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:27:13.452948 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.452863 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2pv5\" (UniqueName: \"kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5\") pod \"10c64850-8784-44ca-a6cd-26d488fa32fe\" (UID: \"10c64850-8784-44ca-a6cd-26d488fa32fe\") " Mar 18 17:27:13.455451 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.455419 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5" (OuterVolumeSpecName: "kube-api-access-l2pv5") pod "10c64850-8784-44ca-a6cd-26d488fa32fe" (UID: "10c64850-8784-44ca-a6cd-26d488fa32fe"). InnerVolumeSpecName "kube-api-access-l2pv5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 17:27:13.554355 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.554322 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2pv5\" (UniqueName: \"kubernetes.io/projected/10c64850-8784-44ca-a6cd-26d488fa32fe-kube-api-access-l2pv5\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\"" Mar 18 17:27:13.841534 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.841451 2571 generic.go:358] "Generic (PLEG): container finished" podID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerID="374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a" exitCode=0 Mar 18 17:27:13.841534 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.841507 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" Mar 18 17:27:13.841728 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.841533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" event={"ID":"10c64850-8784-44ca-a6cd-26d488fa32fe","Type":"ContainerDied","Data":"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a"} Mar 18 17:27:13.841728 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.841574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs" event={"ID":"10c64850-8784-44ca-a6cd-26d488fa32fe","Type":"ContainerDied","Data":"cee80d2d4170a062d508c30701dc44524f678c73e598e94f1f045c3492e8d565"} Mar 18 17:27:13.841728 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.841590 2571 scope.go:117] "RemoveContainer" containerID="374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a" Mar 18 17:27:13.850122 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.850099 2571 scope.go:117] "RemoveContainer" containerID="374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a" 
Mar 18 17:27:13.850392 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:27:13.850373 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a\": container with ID starting with 374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a not found: ID does not exist" containerID="374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a"
Mar 18 17:27:13.850449 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.850401 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a"} err="failed to get container status \"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a\": rpc error: code = NotFound desc = could not find container \"374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a\": container with ID starting with 374bdff9aa996003feccea1d0f07607e94e170418e86853a009488106134421a not found: ID does not exist"
Mar 18 17:27:13.860623 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.860594 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"]
Mar 18 17:27:13.863788 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:13.863766 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-6eae8-predictor-77dd5796b6-zz7vs"]
Mar 18 17:27:14.125629 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:27:14.125543 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" path="/var/lib/kubelet/pods/10c64850-8784-44ca-a6cd-26d488fa32fe/volumes"
Mar 18 17:34:23.791865 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:23.791831 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"]
Mar 18 17:34:23.794575 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:23.792103 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" containerID="cri-o://69001f036334db45d8d6a8ab50eafc15dbb15bebb7983346a372215674ba12bf" gracePeriod=30
Mar 18 17:34:24.622766 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.622731 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xpk7m/must-gather-985sf"]
Mar 18 17:34:24.623072 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623059 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container"
Mar 18 17:34:24.623181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623074 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container"
Mar 18 17:34:24.623181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623091 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container"
Mar 18 17:34:24.623181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623096 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container"
Mar 18 17:34:24.623181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623162 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="10c64850-8784-44ca-a6cd-26d488fa32fe" containerName="kserve-container"
Mar 18 17:34:24.623181 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.623170 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcc10674-da0b-4889-8b6f-5f40931c86e3" containerName="kserve-container"
Mar 18 17:34:24.626266 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.626250 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.628271 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.628248 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xpk7m\"/\"openshift-service-ca.crt\""
Mar 18 17:34:24.628993 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.628950 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xpk7m\"/\"kube-root-ca.crt\""
Mar 18 17:34:24.628993 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.628967 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xpk7m\"/\"default-dockercfg-qtwxw\""
Mar 18 17:34:24.640989 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.640933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xpk7m/must-gather-985sf"]
Mar 18 17:34:24.743991 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.743942 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.744178 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.744086 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzn9\" (UniqueName: \"kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.844608 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.844568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.845125 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.844741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzn9\" (UniqueName: \"kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.845125 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.845022 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.853381 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.853361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzn9\" (UniqueName: \"kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9\") pod \"must-gather-985sf\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") " pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:24.947050 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:24.947013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:25.067397 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:25.067375 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xpk7m/must-gather-985sf"]
Mar 18 17:34:25.069509 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:34:25.069469 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23bf5d95_a532_4acc_9a4d_8687025be61b.slice/crio-f5536d5ee23d671ce1cc27e9b1c265192343a8894fafc2b676c19f87f231cd56 WatchSource:0}: Error finding container f5536d5ee23d671ce1cc27e9b1c265192343a8894fafc2b676c19f87f231cd56: Status 404 returned error can't find the container with id f5536d5ee23d671ce1cc27e9b1c265192343a8894fafc2b676c19f87f231cd56
Mar 18 17:34:25.071360 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:25.071344 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:34:25.120675 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:25.120640 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xpk7m/must-gather-985sf" event={"ID":"23bf5d95-a532-4acc-9a4d-8687025be61b","Type":"ContainerStarted","Data":"f5536d5ee23d671ce1cc27e9b1c265192343a8894fafc2b676c19f87f231cd56"}
Mar 18 17:34:25.982151 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:25.982109 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Mar 18 17:34:27.129033 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.128995 2571 generic.go:358] "Generic (PLEG): container finished" podID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerID="69001f036334db45d8d6a8ab50eafc15dbb15bebb7983346a372215674ba12bf" exitCode=0
Mar 18 17:34:27.129512 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.129003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" event={"ID":"7c139f50-4575-43f9-b8ee-3b00be7c1b44","Type":"ContainerDied","Data":"69001f036334db45d8d6a8ab50eafc15dbb15bebb7983346a372215674ba12bf"}
Mar 18 17:34:27.167759 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.167735 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"
Mar 18 17:34:27.266199 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.266166 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxkjm\" (UniqueName: \"kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm\") pod \"7c139f50-4575-43f9-b8ee-3b00be7c1b44\" (UID: \"7c139f50-4575-43f9-b8ee-3b00be7c1b44\") "
Mar 18 17:34:27.268821 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.268768 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm" (OuterVolumeSpecName: "kube-api-access-zxkjm") pod "7c139f50-4575-43f9-b8ee-3b00be7c1b44" (UID: "7c139f50-4575-43f9-b8ee-3b00be7c1b44"). InnerVolumeSpecName "kube-api-access-zxkjm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:34:27.367281 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:27.367250 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxkjm\" (UniqueName: \"kubernetes.io/projected/7c139f50-4575-43f9-b8ee-3b00be7c1b44-kube-api-access-zxkjm\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:34:28.133606 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:28.133572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r" event={"ID":"7c139f50-4575-43f9-b8ee-3b00be7c1b44","Type":"ContainerDied","Data":"918b4b5bb4b9926f9b63ecd08192dd6cb3affe6ea6fbc703a96170ea1354e592"}
Mar 18 17:34:28.133932 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:28.133625 2571 scope.go:117] "RemoveContainer" containerID="69001f036334db45d8d6a8ab50eafc15dbb15bebb7983346a372215674ba12bf"
Mar 18 17:34:28.133932 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:28.133625 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"
Mar 18 17:34:28.150480 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:28.150443 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"]
Mar 18 17:34:28.155331 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:28.155306 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-eb922-predictor-6b476995dd-bh64r"]
Mar 18 17:34:30.128109 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:30.128071 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" path="/var/lib/kubelet/pods/7c139f50-4575-43f9-b8ee-3b00be7c1b44/volumes"
Mar 18 17:34:31.147547 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:31.147509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xpk7m/must-gather-985sf" event={"ID":"23bf5d95-a532-4acc-9a4d-8687025be61b","Type":"ContainerStarted","Data":"2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a"}
Mar 18 17:34:31.147547 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:31.147552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xpk7m/must-gather-985sf" event={"ID":"23bf5d95-a532-4acc-9a4d-8687025be61b","Type":"ContainerStarted","Data":"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"}
Mar 18 17:34:31.164424 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:31.164376 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xpk7m/must-gather-985sf" podStartSLOduration=1.736182263 podStartE2EDuration="7.16436046s" podCreationTimestamp="2026-03-18 17:34:24 +0000 UTC" firstStartedPulling="2026-03-18 17:34:25.07146639 +0000 UTC m=+3005.601339550" lastFinishedPulling="2026-03-18 17:34:30.499644585 +0000 UTC m=+3011.029517747" observedRunningTime="2026-03-18 17:34:31.162856174 +0000 UTC m=+3011.692729356" watchObservedRunningTime="2026-03-18 17:34:31.16436046 +0000 UTC m=+3011.694233641"
Mar 18 17:34:51.219570 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.219529 2571 generic.go:358] "Generic (PLEG): container finished" podID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerID="fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc" exitCode=0
Mar 18 17:34:51.220019 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.219589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xpk7m/must-gather-985sf" event={"ID":"23bf5d95-a532-4acc-9a4d-8687025be61b","Type":"ContainerDied","Data":"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"}
Mar 18 17:34:51.220019 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.219917 2571 scope.go:117] "RemoveContainer" containerID="fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"
Mar 18 17:34:51.368126 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.368092 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xpk7m_must-gather-985sf_23bf5d95-a532-4acc-9a4d-8687025be61b/gather/0.log"
Mar 18 17:34:51.933196 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.933155 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gffj5/must-gather-dz65l"]
Mar 18 17:34:51.933488 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.933477 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container"
Mar 18 17:34:51.933532 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.933490 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container"
Mar 18 17:34:51.933572 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.933540 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c139f50-4575-43f9-b8ee-3b00be7c1b44" containerName="kserve-container"
Mar 18 17:34:51.936586 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.936566 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:51.945382 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.945362 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"kube-root-ca.crt\""
Mar 18 17:34:51.945496 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.945368 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gffj5\"/\"default-dockercfg-4qhj7\""
Mar 18 17:34:51.945555 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.945524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"openshift-service-ca.crt\""
Mar 18 17:34:51.950798 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:51.950774 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/must-gather-dz65l"]
Mar 18 17:34:52.085221 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.085187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthrz\" (UniqueName: \"kubernetes.io/projected/70a48373-062b-43f5-9dea-d18afd2295fe-kube-api-access-fthrz\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.085390 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.085286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70a48373-062b-43f5-9dea-d18afd2295fe-must-gather-output\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.186494 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.186402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70a48373-062b-43f5-9dea-d18afd2295fe-must-gather-output\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.186494 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.186451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fthrz\" (UniqueName: \"kubernetes.io/projected/70a48373-062b-43f5-9dea-d18afd2295fe-kube-api-access-fthrz\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.186750 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.186731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70a48373-062b-43f5-9dea-d18afd2295fe-must-gather-output\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.194701 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.194670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthrz\" (UniqueName: \"kubernetes.io/projected/70a48373-062b-43f5-9dea-d18afd2295fe-kube-api-access-fthrz\") pod \"must-gather-dz65l\" (UID: \"70a48373-062b-43f5-9dea-d18afd2295fe\") " pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.245712 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.245681 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/must-gather-dz65l"
Mar 18 17:34:52.366077 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:52.365926 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/must-gather-dz65l"]
Mar 18 17:34:52.368756 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:34:52.368719 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70a48373_062b_43f5_9dea_d18afd2295fe.slice/crio-3b8c61c2593a12322805945a66b2401e5c6bd0a38a1698854a73a5f061f8cc30 WatchSource:0}: Error finding container 3b8c61c2593a12322805945a66b2401e5c6bd0a38a1698854a73a5f061f8cc30: Status 404 returned error can't find the container with id 3b8c61c2593a12322805945a66b2401e5c6bd0a38a1698854a73a5f061f8cc30
Mar 18 17:34:53.226163 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:53.226128 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/must-gather-dz65l" event={"ID":"70a48373-062b-43f5-9dea-d18afd2295fe","Type":"ContainerStarted","Data":"3b8c61c2593a12322805945a66b2401e5c6bd0a38a1698854a73a5f061f8cc30"}
Mar 18 17:34:54.231816 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.231781 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/must-gather-dz65l" event={"ID":"70a48373-062b-43f5-9dea-d18afd2295fe","Type":"ContainerStarted","Data":"343678363d177ed0747733a52e313c66525215ed25eabc832ac2e2a43f7294fe"}
Mar 18 17:34:54.231816 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.231820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/must-gather-dz65l" event={"ID":"70a48373-062b-43f5-9dea-d18afd2295fe","Type":"ContainerStarted","Data":"2f3adff5bd57078a57f9bfd23df4ec589a6ca942180ed5849f145152eaa88da9"}
Mar 18 17:34:54.249489 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.249418 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gffj5/must-gather-dz65l" podStartSLOduration=2.359285195 podStartE2EDuration="3.249388911s" podCreationTimestamp="2026-03-18 17:34:51 +0000 UTC" firstStartedPulling="2026-03-18 17:34:52.370742948 +0000 UTC m=+3032.900616112" lastFinishedPulling="2026-03-18 17:34:53.260846664 +0000 UTC m=+3033.790719828" observedRunningTime="2026-03-18 17:34:54.247770549 +0000 UTC m=+3034.777643757" watchObservedRunningTime="2026-03-18 17:34:54.249388911 +0000 UTC m=+3034.779262094"
Mar 18 17:34:54.762306 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.762274 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x942k_9addceab-d971-4f5c-a01e-acfdb1254fab/global-pull-secret-syncer/0.log"
Mar 18 17:34:54.926186 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.926153 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kv8gs_6d41cdd0-c3ee-43e6-84d1-051cad027162/konnectivity-agent/0.log"
Mar 18 17:34:54.956992 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:54.956944 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-43.ec2.internal_cf17cac2b800afcc6c2fd3a4eb5e5a86/haproxy/0.log"
Mar 18 17:34:56.764330 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:56.764291 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xpk7m/must-gather-985sf"]
Mar 18 17:34:56.765270 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:56.765232 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xpk7m/must-gather-985sf" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="copy" containerID="cri-o://2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a" gracePeriod=2
Mar 18 17:34:56.766843 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:56.766390 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xpk7m/must-gather-985sf"]
Mar 18 17:34:56.767535 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:56.767491 2571 status_manager.go:895] "Failed to get status for pod" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" pod="openshift-must-gather-xpk7m/must-gather-985sf" err="pods \"must-gather-985sf\" is forbidden: User \"system:node:ip-10-0-139-43.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xpk7m\": no relationship found between node 'ip-10-0-139-43.ec2.internal' and this object"
Mar 18 17:34:57.153270 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.152460 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xpk7m_must-gather-985sf_23bf5d95-a532-4acc-9a4d-8687025be61b/copy/0.log"
Mar 18 17:34:57.153270 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.152878 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:57.155584 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.155536 2571 status_manager.go:895] "Failed to get status for pod" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" pod="openshift-must-gather-xpk7m/must-gather-985sf" err="pods \"must-gather-985sf\" is forbidden: User \"system:node:ip-10-0-139-43.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xpk7m\": no relationship found between node 'ip-10-0-139-43.ec2.internal' and this object"
Mar 18 17:34:57.254127 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.253197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xpk7m_must-gather-985sf_23bf5d95-a532-4acc-9a4d-8687025be61b/copy/0.log"
Mar 18 17:34:57.254127 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.253613 2571 generic.go:358] "Generic (PLEG): container finished" podID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerID="2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a" exitCode=143
Mar 18 17:34:57.254127 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.253700 2571 scope.go:117] "RemoveContainer" containerID="2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a"
Mar 18 17:34:57.254127 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.253752 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xpk7m/must-gather-985sf"
Mar 18 17:34:57.260080 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.256167 2571 status_manager.go:895] "Failed to get status for pod" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" pod="openshift-must-gather-xpk7m/must-gather-985sf" err="pods \"must-gather-985sf\" is forbidden: User \"system:node:ip-10-0-139-43.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xpk7m\": no relationship found between node 'ip-10-0-139-43.ec2.internal' and this object"
Mar 18 17:34:57.274780 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.273458 2571 scope.go:117] "RemoveContainer" containerID="fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"
Mar 18 17:34:57.342961 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.342772 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzzn9\" (UniqueName: \"kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9\") pod \"23bf5d95-a532-4acc-9a4d-8687025be61b\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") "
Mar 18 17:34:57.342961 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.342860 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output\") pod \"23bf5d95-a532-4acc-9a4d-8687025be61b\" (UID: \"23bf5d95-a532-4acc-9a4d-8687025be61b\") "
Mar 18 17:34:57.347139 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.345723 2571 scope.go:117] "RemoveContainer" containerID="2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a"
Mar 18 17:34:57.347139 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.346432 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "23bf5d95-a532-4acc-9a4d-8687025be61b" (UID: "23bf5d95-a532-4acc-9a4d-8687025be61b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:34:57.347139 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:34:57.346469 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a\": container with ID starting with 2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a not found: ID does not exist" containerID="2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a"
Mar 18 17:34:57.347139 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.346508 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a"} err="failed to get container status \"2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a\": rpc error: code = NotFound desc = could not find container \"2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a\": container with ID starting with 2fa9d2d63a5e43582918d16edebc25d48a813090350d8d6beddcb674b4ccc50a not found: ID does not exist"
Mar 18 17:34:57.347139 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.346536 2571 scope.go:117] "RemoveContainer" containerID="fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"
Mar 18 17:34:57.347415 ip-10-0-139-43 kubenswrapper[2571]: E0318 17:34:57.347363 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc\": container with ID starting with fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc not found: ID does not exist" containerID="fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"
Mar 18 17:34:57.347415 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.347392 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc"} err="failed to get container status \"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc\": rpc error: code = NotFound desc = could not find container \"fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc\": container with ID starting with fd6a4560f3265a64fbd3020a7f2be06b7c173dea710d5af11b62d959490c11dc not found: ID does not exist"
Mar 18 17:34:57.347415 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.347398 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9" (OuterVolumeSpecName: "kube-api-access-hzzn9") pod "23bf5d95-a532-4acc-9a4d-8687025be61b" (UID: "23bf5d95-a532-4acc-9a4d-8687025be61b"). InnerVolumeSpecName "kube-api-access-hzzn9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:34:57.444358 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.444314 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzzn9\" (UniqueName: \"kubernetes.io/projected/23bf5d95-a532-4acc-9a4d-8687025be61b-kube-api-access-hzzn9\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:34:57.444358 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.444359 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23bf5d95-a532-4acc-9a4d-8687025be61b-must-gather-output\") on node \"ip-10-0-139-43.ec2.internal\" DevicePath \"\""
Mar 18 17:34:57.567428 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:57.567386 2571 status_manager.go:895] "Failed to get status for pod" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" pod="openshift-must-gather-xpk7m/must-gather-985sf" err="pods \"must-gather-985sf\" is forbidden: User \"system:node:ip-10-0-139-43.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xpk7m\": no relationship found between node 'ip-10-0-139-43.ec2.internal' and this object"
Mar 18 17:34:58.129769 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.129666 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" path="/var/lib/kubelet/pods/23bf5d95-a532-4acc-9a4d-8687025be61b/volumes"
Mar 18 17:34:58.236598 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.236568 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/alertmanager/0.log"
Mar 18 17:34:58.271149 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.271084 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/config-reloader/0.log"
Mar 18 17:34:58.302291 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.302255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/kube-rbac-proxy-web/0.log"
Mar 18 17:34:58.332153 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.332037 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/kube-rbac-proxy/0.log"
Mar 18 17:34:58.361275 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.361169 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/kube-rbac-proxy-metric/0.log"
Mar 18 17:34:58.387830 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.387570 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/prom-label-proxy/0.log"
Mar 18 17:34:58.421292 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.421259 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_343eeaf2-6d7d-4ebc-9267-226d234372e5/init-config-reloader/0.log"
Mar 18 17:34:58.466380 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.466344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-2t9zt_f0e93606-444c-4b6a-8294-6603a2b534e8/cluster-monitoring-operator/0.log"
Mar 18 17:34:58.498024 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.497967 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-z2nll_03c91678-fee8-415c-9e15-85a3cedd1294/kube-state-metrics/0.log"
Mar 18 17:34:58.528328 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.528301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-z2nll_03c91678-fee8-415c-9e15-85a3cedd1294/kube-rbac-proxy-main/0.log"
Mar 18 17:34:58.557780 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.557747 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-z2nll_03c91678-fee8-415c-9e15-85a3cedd1294/kube-rbac-proxy-self/0.log"
Mar 18 17:34:58.759511 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.759264 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-82cfk_0d4a41e6-6f72-45dd-a626-762b2819804c/node-exporter/0.log"
Mar 18 17:34:58.786457 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.786427 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-82cfk_0d4a41e6-6f72-45dd-a626-762b2819804c/kube-rbac-proxy/0.log"
Mar 18 17:34:58.811731 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:58.811707 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-82cfk_0d4a41e6-6f72-45dd-a626-762b2819804c/init-textfile/0.log"
Mar 18 17:34:59.034842 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.034756 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/prometheus/0.log"
Mar 18 17:34:59.054313 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.054284 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/config-reloader/0.log"
Mar 18 17:34:59.078214 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.078183 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/thanos-sidecar/0.log"
Mar 18 17:34:59.103755 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.103731 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/kube-rbac-proxy-web/0.log" Mar 18 17:34:59.127633 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.127526 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/kube-rbac-proxy/0.log" Mar 18 17:34:59.153338 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.153301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/kube-rbac-proxy-thanos/0.log" Mar 18 17:34:59.178114 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.178086 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c9f4dffc-a8b9-48e9-896e-9ff2abf4c12b/init-config-reloader/0.log" Mar 18 17:34:59.211108 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.211079 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-g598t_8909e55e-6aec-466e-8776-e284dd7ecf1f/prometheus-operator/0.log" Mar 18 17:34:59.231598 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.231573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-g598t_8909e55e-6aec-466e-8776-e284dd7ecf1f/kube-rbac-proxy/0.log" Mar 18 17:34:59.259475 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:34:59.259445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-vqvsg_14be01bd-4943-46df-a51e-deeac401e9e0/prometheus-operator-admission-webhook/0.log" Mar 18 17:35:02.514939 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.514902 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76"] Mar 18 17:35:02.515478 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515431 2571 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="copy" Mar 18 17:35:02.515478 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515453 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="copy" Mar 18 17:35:02.515596 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515488 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="gather" Mar 18 17:35:02.515596 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515497 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="gather" Mar 18 17:35:02.515596 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515572 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="copy" Mar 18 17:35:02.515596 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.515587 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="23bf5d95-a532-4acc-9a4d-8687025be61b" containerName="gather" Mar 18 17:35:02.520909 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.520885 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.527375 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.527350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76"] Mar 18 17:35:02.699216 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.699176 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpsn\" (UniqueName: \"kubernetes.io/projected/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-kube-api-access-jhpsn\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.699412 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.699222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-proc\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.699412 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.699296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-podres\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.699412 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.699372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-sys\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " 
pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.699412 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.699398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-lib-modules\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.789611 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.789529 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zdwcx_bfa6872b-035c-45af-901b-55b2097c2b3d/dns/0.log" Mar 18 17:35:02.800448 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-podres\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800621 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-sys\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800621 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-sys\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800621 
ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-lib-modules\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800621 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-podres\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpsn\" (UniqueName: \"kubernetes.io/projected/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-kube-api-access-jhpsn\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-proc\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-lib-modules\") pod \"perf-node-gather-daemonset-fmw76\" (UID: 
\"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.800770 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.800712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-proc\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.808421 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.808394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpsn\" (UniqueName: \"kubernetes.io/projected/d3aa2c70-415d-4092-8936-f5cb9a09d4ab-kube-api-access-jhpsn\") pod \"perf-node-gather-daemonset-fmw76\" (UID: \"d3aa2c70-415d-4092-8936-f5cb9a09d4ab\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.814104 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.814077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zdwcx_bfa6872b-035c-45af-901b-55b2097c2b3d/kube-rbac-proxy/0.log" Mar 18 17:35:02.834702 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.834664 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:02.842408 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.842388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fd4lp_17730a74-9827-4a73-be22-72ef96f3aeb0/dns-node-resolver/0.log" Mar 18 17:35:02.974651 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:02.974092 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76"] Mar 18 17:35:02.981967 ip-10-0-139-43 kubenswrapper[2571]: W0318 17:35:02.981934 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3aa2c70_415d_4092_8936_f5cb9a09d4ab.slice/crio-13b2872946e167a24f1743aa6a79dd3768a95f27d7a3cb330aa01e561acff02e WatchSource:0}: Error finding container 13b2872946e167a24f1743aa6a79dd3768a95f27d7a3cb330aa01e561acff02e: Status 404 returned error can't find the container with id 13b2872946e167a24f1743aa6a79dd3768a95f27d7a3cb330aa01e561acff02e Mar 18 17:35:03.279270 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:03.279240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" event={"ID":"d3aa2c70-415d-4092-8936-f5cb9a09d4ab","Type":"ContainerStarted","Data":"5ee3f33449e9a2bca8d3d8b77853d5178e4c47b37fc5386e5ec04eed9f54555c"} Mar 18 17:35:03.279270 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:03.279276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" event={"ID":"d3aa2c70-415d-4092-8936-f5cb9a09d4ab","Type":"ContainerStarted","Data":"13b2872946e167a24f1743aa6a79dd3768a95f27d7a3cb330aa01e561acff02e"} Mar 18 17:35:03.279506 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:03.279369 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 
18 17:35:03.297199 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:03.297134 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" podStartSLOduration=1.29711231 podStartE2EDuration="1.29711231s" podCreationTimestamp="2026-03-18 17:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:35:03.29439702 +0000 UTC m=+3043.824270230" watchObservedRunningTime="2026-03-18 17:35:03.29711231 +0000 UTC m=+3043.826985493" Mar 18 17:35:03.427884 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:03.427847 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rwpwz_8197d1a6-4bd6-4f2e-9e43-e0bd2dff50ae/node-ca/0.log" Mar 18 17:35:04.151942 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:04.151909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6785879c4-rgndg_09d9a175-36dd-4900-92c2-7b9a0986e68d/router/0.log" Mar 18 17:35:04.476700 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:04.476628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-98z82_54e235df-2ad8-4fbe-81bc-dfb66eafbf2b/serve-healthcheck-canary/0.log" Mar 18 17:35:05.061111 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:05.061076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-k622p_517e2087-dc4a-4eb9-a827-7a1760e0d5a2/kube-rbac-proxy/0.log" Mar 18 17:35:05.084106 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:05.084080 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-k622p_517e2087-dc4a-4eb9-a827-7a1760e0d5a2/exporter/0.log" Mar 18 17:35:05.106955 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:05.106928 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-k622p_517e2087-dc4a-4eb9-a827-7a1760e0d5a2/extractor/0.log" Mar 18 17:35:07.617625 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:07.617593 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-g8c44_882a0a1f-b512-4373-8265-28e77cc3c62a/s3-init/0.log" Mar 18 17:35:09.292858 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:09.292828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-fmw76" Mar 18 17:35:11.860735 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:11.860640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-mr5cf_71b1e3b5-f3de-4e5a-855d-d72d883b476f/kube-storage-version-migrator-operator/1.log" Mar 18 17:35:11.862153 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:11.862126 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-mr5cf_71b1e3b5-f3de-4e5a-855d-d72d883b476f/kube-storage-version-migrator-operator/0.log" Mar 18 17:35:12.868388 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.868355 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/kube-multus-additional-cni-plugins/0.log" Mar 18 17:35:12.894834 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.894806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/egress-router-binary-copy/0.log" Mar 18 17:35:12.917289 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.917258 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/cni-plugins/0.log" Mar 18 17:35:12.940205 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.940177 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/bond-cni-plugin/0.log" Mar 18 17:35:12.964673 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.964630 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/routeoverride-cni/0.log" Mar 18 17:35:12.988602 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:12.988577 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/whereabouts-cni-bincopy/0.log" Mar 18 17:35:13.012851 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:13.012825 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fpm5p_d2c29efd-c4cf-40cc-91bb-1c82e76eea41/whereabouts-cni/0.log" Mar 18 17:35:13.427863 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:13.427824 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lfgl5_3fc7c75c-9d88-4d69-a623-1eb256939d93/kube-multus/0.log" Mar 18 17:35:13.504014 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:13.503984 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jf76n_bf79a021-e091-4ae2-bd19-2bd1205de781/network-metrics-daemon/0.log" Mar 18 17:35:13.527953 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:13.527925 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jf76n_bf79a021-e091-4ae2-bd19-2bd1205de781/kube-rbac-proxy/0.log" Mar 18 17:35:14.841719 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.841667 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/ovn-controller/0.log" Mar 18 17:35:14.879966 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.879931 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/ovn-acl-logging/0.log" Mar 18 17:35:14.905182 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.905143 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/kube-rbac-proxy-node/0.log" Mar 18 17:35:14.929095 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.929067 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/kube-rbac-proxy-ovn-metrics/0.log" Mar 18 17:35:14.953069 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.953041 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/northd/0.log" Mar 18 17:35:14.976893 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:14.976865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/nbdb/0.log" Mar 18 17:35:15.000687 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:15.000654 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/sbdb/0.log" Mar 18 17:35:15.117364 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:15.117292 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m9kqz_ee65363f-46b9-4194-8b09-6d6e1a39303b/ovnkube-controller/0.log" Mar 18 17:35:16.613375 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:16.613340 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qv92n_3e4d4190-53f5-422e-ba62-0a231b728c8d/network-check-target-container/0.log" Mar 18 17:35:17.572445 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:17.572415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5nt27_91a1650a-2ef0-48b3-a7c3-38900240dde0/iptables-alerter/0.log" Mar 18 17:35:18.316633 ip-10-0-139-43 kubenswrapper[2571]: I0318 17:35:18.316608 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nhkg8_ef75d54c-b5e4-43b6-a240-283b489f554e/tuned/0.log"