Apr 16 22:13:27.384891 ip-10-0-135-106 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:27.857739 ip-10-0-135-106 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.857739 ip-10-0-135-106 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:27.857739 ip-10-0-135-106 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.857739 ip-10-0-135-106 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:27.857739 ip-10-0-135-106 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:27.860433 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.860287    2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:27.865420 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865402    2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.865420 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865420    2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865425    2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865430    2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865433    2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865435    2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865438    2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865441    2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865444    2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865446    2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865449    2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865452    2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865454    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865457    2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865459    2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865462    2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865466    2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865468    2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865471    2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865473    2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.865487 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865476    2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865478    2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865481    2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865484    2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865486    2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865489    2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865491    2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865494    2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865497    2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865500    2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865503    2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865505    2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865508    2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865511    2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865513    2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865516    2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865519    2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865521    2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865523    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.865967 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865526    2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865530    2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865534    2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865536    2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865539    2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865541    2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865544    2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865546    2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865549    2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865551    2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865555    2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865557    2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865560    2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865562    2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865564    2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865569    2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865572    2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865575    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865578    2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865580    2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.866433 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865583    2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865586    2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865589    2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865593    2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865597    2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865614    2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865619    2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865623    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865626    2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865629    2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865632    2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865634    2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865637    2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865640    2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865642    2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865645    2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865648    2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865652    2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865655    2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.866956 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865658    2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865660    2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865663    2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865665    2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865668    2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865671    2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865674    2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.865676    2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866093    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866099    2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866102    2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866105    2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866108    2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866110    2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866113    2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866116    2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866118    2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866121    2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866124    2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866127    2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:27.867405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866129    2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866132    2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866134    2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866137    2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866139    2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866142    2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866144    2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866148    2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866152    2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866155    2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866158    2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866160    2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866162    2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866165    2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866168    2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866170    2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866174    2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866176    2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866182    2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:27.867897 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866186    2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866189    2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866192    2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866196    2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866199    2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866201    2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866204    2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866207    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866209    2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866212    2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866214    2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866217    2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866219    2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866222    2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866224    2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866227    2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866230    2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866232    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866234    2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:27.868355 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866237    2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866239    2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866242    2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866244    2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866247    2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866249    2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866252    2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866254    2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866257    2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866260    2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866263    2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866269    2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866272    2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866275    2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866278    2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866280    2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866283    2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866285    2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866288    2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866290    2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:27.868833 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866293    2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866295    2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866298    2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866300    2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866302    2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866305    2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866307    2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866310    2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866312    2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866315    2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866317    2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866320    2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866322    2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866324    2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866327    2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.866330    2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867076    2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867093    2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867102    2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867107    2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867111    2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:27.869325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867115    2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867120    2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867125    2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867129    2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867133    2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867136    2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867140    2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867143    2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867146    2562 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867149    2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867152    2562 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867155    2562 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867158    2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867161    2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867165    2562 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867169    2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867172    2562 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867175    2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867178    2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867182    2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867185    2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867188    2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867191    2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867194    2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:27.869843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867197    2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867200    2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867203    2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867206    2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867210    2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867213    2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867217    2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867220    2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867223    2562 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867226    2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867231    2562 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867234    2562 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867238    2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867241    2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867244    2562 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867248    2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867251    2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867254    2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867257    2562 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867260    2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867263    2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867266    2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867269    2562 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867272 2562 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867275 2562 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:27.870415 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867278 2562 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867282 2562 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867285 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867288 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867291 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867294 2562 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867297 2562 flags.go:64] FLAG: --help="false" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867300 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-135-106.ec2.internal" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867303 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867306 2562 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867309 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867313 2562 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867316 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867319 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867322 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867325 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867328 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867331 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867334 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867337 2562 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867341 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867344 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867348 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867350 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:27.871152 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867353 2562 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867356 2562 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867359 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867362 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867367 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867371 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867374 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867376 2562 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867379 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867383 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867386 2562 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867389 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867393 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867396 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867401 2562 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867404 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:13:27.867407 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867410 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867413 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867416 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867419 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867423 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867430 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867433 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867437 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:27.871745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867440 2562 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867443 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867449 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867451 2562 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867455 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 16 
22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867458 2562 flags.go:64] FLAG: --port="10250" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867461 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867464 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cc6e2ddec57dfc39" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867467 2562 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867470 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867473 2562 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867476 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867480 2562 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867483 2562 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867486 2562 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867489 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867492 2562 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867496 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867499 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867502 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 
22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867505 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867508 2562 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867514 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867517 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867520 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:27.872386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867523 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867527 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867530 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867533 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867536 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867539 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867542 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867545 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867548 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:27.873018 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:13:27.867551 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867554 2562 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867556 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867563 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867566 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867569 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867573 2562 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867576 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867579 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867581 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867585 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867588 2562 flags.go:64] FLAG: --v="2" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867592 2562 flags.go:64] FLAG: --version="false" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867597 2562 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867612 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 
22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.867616 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:27.873018 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867712 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867715 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867719 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867722 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867726 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867729 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867732 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867735 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867737 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867740 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867743 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867746 2562 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867749 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867751 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867754 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867757 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867759 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867762 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867766 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867769 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:27.873665 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867772 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867775 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867778 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867780 2562 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867783 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867786 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867789 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867793 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867796 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867799 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867802 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867805 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867807 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867810 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867813 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867816 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:27.874189 ip-10-0-135-106 
kubenswrapper[2562]: W0416 22:13:27.867820 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867823 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867825 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:27.874189 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867828 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867830 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867833 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867836 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867839 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867841 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867844 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867846 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867849 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867852 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider 
Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867854 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867859 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867861 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867864 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867867 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867869 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867872 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867874 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867877 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867879 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:27.874884 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867882 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867884 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867887 2562 feature_gate.go:328] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867889 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867891 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867894 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867897 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867899 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867902 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867906 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867909 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867912 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867914 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867917 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867920 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:27.875686 ip-10-0-135-106 
kubenswrapper[2562]: W0416 22:13:27.867922 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867926 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867929 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867932 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:27.875686 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867935 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867938 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867940 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867943 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867946 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867949 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867951 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.867955 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:27.876198 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.868565 2562 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.876312 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.876330 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876385 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876390 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876393 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876396 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876400 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876403 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876406 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:27.876405 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876409 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation 
Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876412 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876415 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876418 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876421 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876423 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876426 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876428 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876431 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876434 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876437 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876440 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876443 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:27.876675 ip-10-0-135-106 
kubenswrapper[2562]: W0416 22:13:27.876446 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876448 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876451 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876454 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876456 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876459 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876461 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:27.876675 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876464 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876466 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876469 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876471 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876474 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876476 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 
22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876479 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876481 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876484 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876486 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876489 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876491 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876494 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876496 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876500 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876502 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876505 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876508 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876510 2562 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876513 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:27.877178 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876516 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876518 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876521 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876524 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876526 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876529 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876532 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876534 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876537 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876539 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876541 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 
16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876544 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876547 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876549 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876552 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876554 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876557 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876559 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876562 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:27.877722 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876564 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876569 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876573 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876576 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876579 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876581 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876584 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876587 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876590 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876593 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876597 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876615 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876619 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876623 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876627 2562 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876631 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876635 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876640 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876646 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:27.878187 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876649 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.876654 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876761 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876767 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876770 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 
22:13:27.876773 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876776 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876778 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876781 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876784 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876787 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876789 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876792 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876795 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876798 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876801 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:27.878667 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876803 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876806 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 
22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876809 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876812 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876815 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876818 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876821 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876823 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876834 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876837 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876840 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876843 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876845 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876848 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 
22:13:27.876851 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876853 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876856 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876858 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876861 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876863 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:27.879062 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876866 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876868 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876871 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876873 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876876 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876878 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876881 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 
22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876884 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876886 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876889 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876892 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876895 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876897 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876900 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876902 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876905 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876907 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876910 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876914 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:27.879551 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876918 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876921 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876925 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876928 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876931 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876934 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876937 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876940 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876942 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876945 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876948 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876950 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876953 2562 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876955 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876958 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876960 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876964 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876967 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876969 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876972 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:27.880031 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876975 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876978 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876980 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876983 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876985 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 
22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876988 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876990 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876993 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876995 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.876998 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.877001 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.877003 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:27.877006 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.877011 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:13:27.880512 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.877708 2562 server.go:962] "Client rotation is on, will bootstrap in 
background" Apr 16 22:13:27.880972 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.880957 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:13:27.881929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.881917 2562 server.go:1019] "Starting client certificate rotation" Apr 16 22:13:27.882036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.882015 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:27.882070 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.882053 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:13:27.909616 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.909589 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:27.912480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.912464 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:13:27.930899 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.930872 2562 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:13:27.937480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.937459 2562 log.go:25] "Validated CRI v1 image API" Apr 16 22:13:27.938836 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.938817 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:13:27.942394 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.942374 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:13:27.943104 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.943084 2562 fs.go:135] Filesystem UUIDs: 
map[3acfc8bd-ba61-4427-b1b7-f00223048640:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9229e540-e094-415d-8969-a30d53853c70:/dev/nvme0n1p4] Apr 16 22:13:27.943169 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.943105 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:13:27.950738 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.950611 2562 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:27.947808314 +0000 UTC m=+0.430327444 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101514 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c02f50c74196f3e765e90bf2ea368 SystemUUID:ec2c02f5-0c74-196f-3e76-5e90bf2ea368 BootID:642d7682-86e2-4641-8ca0-20234a897023 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 
Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6b:ee:d3:3f:a9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6b:ee:d3:3f:a9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:de:8e:a2:84:f7:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:13:27.951275 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951264 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 22:13:27.951379 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951367 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:27.951810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951779 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:27.951954 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951813 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-106.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:13:27.952000 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951964 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:13:27.952000 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951972 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:13:27.952000 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.951986 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:27.953588 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.953575 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:13:27.955268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.955256 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:27.955401 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.955391 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:13:27.958268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.958256 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:13:27.958306 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.958273 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:13:27.958306 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.958290 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:13:27.958306 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.958301 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:13:27.958442 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.958310 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:13:27.960244 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.960226 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:27.960244 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.960246 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:13:27.963640 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.963624 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:13:27.964950 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.964935 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:13:27.966576 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966557 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966580 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966587 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966593 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966599 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966621 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966640 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966647 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966653 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:13:27.966656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966659 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:13:27.966903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966672 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:13:27.966903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.966683 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:13:27.968642 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.968623 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:13:27.968684 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.968643 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:13:27.972179 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972155 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-106.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 22:13:27.972281 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.972216 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 22:13:27.972281 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.972240 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 22:13:27.972281 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972268 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:13:27.972408 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972307 2562 server.go:1295] "Started kubelet"
Apr 16 22:13:27.972439 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972387 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:13:27.972467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972395 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:13:27.972501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.972486 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:13:27.973137 ip-10-0-135-106 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:13:27.973463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.973445 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:13:27.975559 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.975544 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:13:27.979048 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.977465 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a6f605d62553db default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-16 22:13:27.972279259 +0000 UTC m=+0.454798390,LastTimestamp:2026-04-16 22:13:27.972279259 +0000 UTC m=+0.454798390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 16 22:13:27.979374 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.979349 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:27.979863 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.979840 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:13:27.981277 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981251 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:13:27.981277 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981270 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:13:27.981422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981357 2562 factory.go:153] Registering CRI-O factory
Apr 16 22:13:27.981422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981374 2562 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981449 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981455 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981459 2562 factory.go:55] Registering systemd factory
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981464 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981468 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981491 2562 factory.go:103] Registering Raw factory
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981505 2562 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:13:27.981534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.981464 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:13:27.982031 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.981930 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:27.983648 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.983616 2562 manager.go:319] Starting recovery of all containers
Apr 16 22:13:27.983819 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.983783 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:13:27.991814 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.991775 2562 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 22:13:27.992121 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:27.992096 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 22:13:27.996004 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:27.995827 2562 manager.go:324] Recovery completed
Apr 16 22:13:28.000594 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.000580 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.003655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.003631 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.003739 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.003668 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.003739 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.003680 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.004272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.004248 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:13:28.004272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.004261 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:13:28.004272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.004276 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:13:28.005986 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.005905 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a6f605d80413be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-106.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-16 22:13:28.00365459 +0000 UTC m=+0.486173715,LastTimestamp:2026-04-16 22:13:28.00365459 +0000 UTC m=+0.486173715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 16 22:13:28.006703 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.006688 2562 policy_none.go:49] "None policy: Start"
Apr 16 22:13:28.006777 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.006710 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:13:28.006777 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.006725 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:13:28.014333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.014312 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vz49p"
Apr 16 22:13:28.018222 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.018152 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-106.ec2.internal.18a6f605d8045da9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-106.ec2.internal,UID:ip-10-0-135-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-106.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-106.ec2.internal,},FirstTimestamp:2026-04-16 22:13:28.003673513 +0000 UTC m=+0.486192638,LastTimestamp:2026-04-16 22:13:28.003673513 +0000 UTC m=+0.486192638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-106.ec2.internal,}"
Apr 16 22:13:28.022332 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.022312 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vz49p"
Apr 16 22:13:28.044267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044246 2562 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.044291 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044303 2562 server.go:85] "Starting device plugin registration server"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044574 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044584 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044783 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044851 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.044857 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.045495 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:13:28.055544 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.045533 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.123985 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.123899 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:13:28.125214 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.125191 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:13:28.125284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.125232 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:13:28.125284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.125258 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:13:28.125284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.125269 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:13:28.125408 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.125328 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:13:28.127988 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.127963 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:28.145035 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.145007 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.148038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.148018 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.148150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.148053 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.148150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.148067 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.148150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.148095 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.156506 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.156482 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.156506 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.156509 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-106.ec2.internal\": node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.168094 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.168066 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.225710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.225667 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"]
Apr 16 22:13:28.225864 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.225760 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.228019 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.228002 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.228092 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.228035 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.228092 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.228045 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.229081 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.229069 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.229217 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.229202 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.229254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.229232 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.232245 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232225 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.232356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232234 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.232356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232278 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.232356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232288 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.232356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232261 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.232356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.232352 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.233451 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.233437 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.233526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.233463 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:13:28.234280 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.234263 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:13:28.234360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.234290 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:13:28.234360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.234303 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:13:28.258382 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.258360 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-106.ec2.internal\" not found" node="ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.262959 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.262938 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-106.ec2.internal\" not found" node="ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.268135 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.268118 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.369060 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.369028 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.383522 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.383452 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.383522 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.383492 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.383522 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.383520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.469905 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.469849 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.484310 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484284 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.484402 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484313 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.484402 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484331 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.484467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.484467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484394 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/477d8d5a68adc15182b0ab0c3cde7f73-config\") pod \"kube-apiserver-proxy-ip-10-0-135-106.ec2.internal\" (UID: \"477d8d5a68adc15182b0ab0c3cde7f73\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.484467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.484432 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d71cd7329eab1e79f952bebf6a7f77b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal\" (UID: \"8d71cd7329eab1e79f952bebf6a7f77b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.560476 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.560434 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.565878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.565858 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal"
Apr 16 22:13:28.570227 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.570205 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.670956 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.670865 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.771470 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.771429 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.872274 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.872235 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.882614 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.882580 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:28.882750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.882735 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:28.972419 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:28.972382 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found"
Apr 16 22:13:28.979639 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.979592 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:28.990538 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:28.990510 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:29.013540 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.013511 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kf4vg"
Apr 16 22:13:29.021410 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.021385 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kf4vg"
Apr 16 22:13:29.024449 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.024423 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:28 +0000 UTC" deadline="2027-10-01 07:50:57.593384968 +0000 UTC"
Apr 16 22:13:29.024449 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.024449 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12777h37m28.568938607s"
Apr 16 22:13:29.055091 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:29.055045 2562 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477d8d5a68adc15182b0ab0c3cde7f73.slice/crio-5e7ee0d2a995b5fa42fce62a1f23726549a9accda91a990a7f7edf2ac2237c81 WatchSource:0}: Error finding container 5e7ee0d2a995b5fa42fce62a1f23726549a9accda91a990a7f7edf2ac2237c81: Status 404 returned error can't find the container with id 5e7ee0d2a995b5fa42fce62a1f23726549a9accda91a990a7f7edf2ac2237c81 Apr 16 22:13:29.055352 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:29.055327 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d71cd7329eab1e79f952bebf6a7f77b.slice/crio-f48464cb9dcb24e07058c888ec7755c19aa4322acc0f8c2b013207de1b10d2f9 WatchSource:0}: Error finding container f48464cb9dcb24e07058c888ec7755c19aa4322acc0f8c2b013207de1b10d2f9: Status 404 returned error can't find the container with id f48464cb9dcb24e07058c888ec7755c19aa4322acc0f8c2b013207de1b10d2f9 Apr 16 22:13:29.058379 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.058361 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:13:29.059570 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.059549 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:29.073545 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:29.073499 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 16 22:13:29.129035 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.128977 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerStarted","Data":"f48464cb9dcb24e07058c888ec7755c19aa4322acc0f8c2b013207de1b10d2f9"} Apr 16 22:13:29.130021 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:13:29.129990 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" event={"ID":"477d8d5a68adc15182b0ab0c3cde7f73","Type":"ContainerStarted","Data":"5e7ee0d2a995b5fa42fce62a1f23726549a9accda91a990a7f7edf2ac2237c81"} Apr 16 22:13:29.174232 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:29.174161 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-106.ec2.internal\" not found" Apr 16 22:13:29.246325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.246293 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:29.264641 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.264597 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:29.280427 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.280400 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" Apr 16 22:13:29.292817 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.292801 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:29.294016 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.294001 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" Apr 16 22:13:29.299759 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.299742 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:29.959713 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.959672 2562 apiserver.go:52] "Watching apiserver" Apr 16 
22:13:29.965799 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.965767 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 22:13:29.967890 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.967856 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4jzbz","openshift-network-diagnostics/network-check-target-krggd","openshift-network-operator/iptables-alerter-zshz5","openshift-ovn-kubernetes/ovnkube-node-fwm7d","kube-system/konnectivity-agent-8nkjl","kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r","openshift-dns/node-resolver-l2b5r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal","openshift-multus/multus-additional-cni-plugins-6f9g5","openshift-multus/multus-qcw82","openshift-multus/network-metrics-daemon-4zqvj","openshift-cluster-node-tuning-operator/tuned-c7wd4"] Apr 16 22:13:29.969962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.969924 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:29.971138 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.971113 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:29.971261 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:29.971202 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:29.972146 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.972116 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:29.972261 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.972148 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.972261 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.972155 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vmjgr\"" Apr 16 22:13:29.972507 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.972488 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.973448 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.973426 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zshz5" Apr 16 22:13:29.974869 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.974845 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8nkjl" Apr 16 22:13:29.975807 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.975785 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.975937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.975853 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:29.975937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.975792 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.976055 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.976017 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.976661 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.976399 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rh7fg\"" Apr 16 22:13:29.976661 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.976553 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:29.978693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.977531 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:29.978693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.977884 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:29.978693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.977938 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dpgrh\"" Apr 16 22:13:29.978693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.978133 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:29.978934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.978887 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.979561 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.979540 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.979903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.979834 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:29.979903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.979854 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:29.980008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.979909 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.980170 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980144 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 
22:13:29.980263 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980170 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-swkrs\"" Apr 16 22:13:29.980263 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980245 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:29.980498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980479 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:29.980575 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980555 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-prx6s\"" Apr 16 22:13:29.980860 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980841 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.980945 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.980883 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:29.981929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.981908 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.983040 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.983020 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.983345 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.983247 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.983434 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.983348 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.983642 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.983586 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k442x\"" Apr 16 22:13:29.983901 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.983879 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:29.984104 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984088 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.984178 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984121 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:29.984278 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984259 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tlmhv\"" Apr 16 22:13:29.984346 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984308 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:29.984423 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984407 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.984541 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984525 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:29.984680 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:29.984593 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331" Apr 16 22:13:29.984920 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984902 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7v4r4\"" Apr 16 22:13:29.984920 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.984912 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:29.985596 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.985577 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:29.987306 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.987278 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b9t9q\"" Apr 16 22:13:29.987447 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.987429 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.987524 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.987483 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.991797 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991770 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-system-cni-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.991909 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991816 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-etc-kubernetes\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.991909 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991841 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-slash\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.991909 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991862 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-ovn\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.991909 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991884 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991936 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-etc-selinux\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.991973 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992000 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-script-lib\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992034 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992063 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-socket-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:29.992096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992089 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7073fe4e-234e-40a7-b337-e387d20bd403-agent-certs\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl" Apr 16 22:13:29.992345 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992123 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-socket-dir-parent\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992345 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:13:29.992157 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-registration-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:29.992345 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992213 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d260be2e-0541-4595-95d1-cf52b077b22b-host\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:29.992345 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992308 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-tmp-dir\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:29.992482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992355 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-cnibin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992387 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-netns\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 
22:13:29.992482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992410 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-multus\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-hostroot\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992456 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-netns\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992482 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q78s\" (UniqueName: \"kubernetes.io/projected/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-kube-api-access-5q78s\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992506 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-openvswitch\") pod 
\"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992522 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cnibin\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992549 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-var-lib-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992588 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-os-release\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992634 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-os-release\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992670 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-kubelet\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.992717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992695 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-conf-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992735 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-log-socket\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992790 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwzs\" (UniqueName: \"kubernetes.io/projected/c068e6b8-2d0c-45f9-a80f-87d043b56b89-kube-api-access-qpwzs\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992836 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-host-slash\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992867 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-hosts-file\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992891 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-cni-binary-copy\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992922 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7073fe4e-234e-40a7-b337-e387d20bd403-konnectivity-ca\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992947 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krnhw\" (UniqueName: \"kubernetes.io/projected/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-kube-api-access-krnhw\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-multus-daemon-config\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992983 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-bin\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.992998 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-netd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993019 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovn-node-metrics-cert\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993058 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtmf\" (UniqueName: \"kubernetes.io/projected/d260be2e-0541-4595-95d1-cf52b077b22b-kube-api-access-xrtmf\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-iptables-alerter-script\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993146 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-system-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993175 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-bin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993211 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-device-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993274 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993323 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2kr\" (UniqueName: \"kubernetes.io/projected/9775f627-c910-402f-bb3d-4d424e2d7968-kube-api-access-pb2kr\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993357 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-systemd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993384 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-node-log\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993408 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993431 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-env-overrides\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgfb\" (UniqueName: \"kubernetes.io/projected/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-kube-api-access-tmgfb\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-k8s-cni-cncf-io\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.993526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993524 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-multus-certs\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-etc-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993598 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993636 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-kubelet\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993657 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-systemd-units\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993672 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-config\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d260be2e-0541-4595-95d1-cf52b077b22b-serviceca\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993709 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993735 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smvrw\" (UniqueName: \"kubernetes.io/projected/bff6b952-968b-4fe3-a43f-333dead963bc-kube-api-access-smvrw\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:29.994112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:29.993759 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-sys-fs\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.022347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.022308 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:29 +0000 UTC" deadline="2027-10-19 03:54:43.186955913 +0000 UTC"
Apr 16 22:13:30.022347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.022346 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13205h41m13.164614794s"
Apr 16 22:13:30.082850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.082815 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:30.093989 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.093958 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cnibin\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.093997 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-var-lib-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094032 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-sys\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094053 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-lib-modules\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094078 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-os-release\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094089 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cnibin\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094105 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-os-release\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094089 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-var-lib-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.094165 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-kubelet\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-os-release\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094200 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-kubelet\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094213 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-os-release\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094251 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-conf-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094295 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-log-socket\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094319 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwzs\" (UniqueName: \"kubernetes.io/projected/c068e6b8-2d0c-45f9-a80f-87d043b56b89-kube-api-access-qpwzs\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-tuned\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094352 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-conf-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094370 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-host-slash\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094367 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-log-socket\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-hosts-file\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094428 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-cni-binary-copy\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094431 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-host-slash\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094450 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7073fe4e-234e-40a7-b337-e387d20bd403-konnectivity-ca\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krnhw\" (UniqueName: \"kubernetes.io/projected/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-kube-api-access-krnhw\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094481 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-hosts-file\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094497 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-multus-daemon-config\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.094583 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094520 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-bin\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-netd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094566 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovn-node-metrics-cert\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094598 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtmf\" (UniqueName: \"kubernetes.io/projected/d260be2e-0541-4595-95d1-cf52b077b22b-kube-api-access-xrtmf\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094677 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysconfig\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094690 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-bin\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-iptables-alerter-script\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094735 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094763 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-system-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094787 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-bin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094810 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-device-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094835 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hktlt\" (UniqueName: \"kubernetes.io/projected/6d656b01-098b-4cc0-81bb-66f2e3de7643-kube-api-access-hktlt\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094864 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094888 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094916 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2kr\" (UniqueName: \"kubernetes.io/projected/9775f627-c910-402f-bb3d-4d424e2d7968-kube-api-access-pb2kr\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094940 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-systemd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-node-log\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095197 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095011 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-env-overrides\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095039 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-conf\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095066 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgfb\" (UniqueName: \"kubernetes.io/projected/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-kube-api-access-tmgfb\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095097 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7073fe4e-234e-40a7-b337-e387d20bd403-konnectivity-ca\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095105 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-k8s-cni-cncf-io\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095138 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-multus-certs\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095156 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-etc-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095167 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-device-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095175 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-tmp\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095184 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095198 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095214 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-kubelet\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095230 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-systemd-units\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095248 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-config\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095266 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d260be2e-0541-4595-95d1-cf52b077b22b-serviceca\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095285 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4fd\" (UniqueName: \"kubernetes.io/projected/ef0b8b85-4299-4164-b2f4-ae06377db331-kube-api-access-ch4fd\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:30.095750 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095305 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 
22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095321 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smvrw\" (UniqueName: \"kubernetes.io/projected/bff6b952-968b-4fe3-a43f-333dead963bc-kube-api-access-smvrw\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095338 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-sys-fs\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095353 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-run\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095355 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095369 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-system-cni-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " 
pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095386 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-etc-kubernetes\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095404 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-slash\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095421 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-ovn\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095479 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-k8s-cni-cncf-io\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " 
pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095104 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095489 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-systemd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095631 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095737 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095740 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-iptables-alerter-script\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5" Apr 16 22:13:30.096392 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095783 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-bin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095807 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-env-overrides\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.096392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095830 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-etc-selinux\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095864 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095912 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-script-lib\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095941 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-kubernetes\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095966 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-systemd\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-sys-fs\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096015 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096018 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095319 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-node-log\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096050 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-system-cni-dir\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.095105 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-multus-daemon-config\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096095 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-etc-kubernetes\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096109 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-multus-certs\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.094734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-cni-netd\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096112 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-socket-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096160 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-kubelet\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096156 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7073fe4e-234e-40a7-b337-e387d20bd403-agent-certs\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl" Apr 16 22:13:30.097237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096171 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/d260be2e-0541-4595-95d1-cf52b077b22b-serviceca\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096119 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-etc-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096207 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-etc-selinux\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096209 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-modprobe-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096252 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096275 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-socket-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096315 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-system-cni-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096359 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096410 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-var-lib-kubelet\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096460 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-systemd-units\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:13:30.096532 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-slash\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096574 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-socket-dir-parent\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-ovn\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096706 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-registration-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096742 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d260be2e-0541-4595-95d1-cf52b077b22b-host\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096779 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-script-lib\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9775f627-c910-402f-bb3d-4d424e2d7968-registration-dir\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.097970 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bff6b952-968b-4fe3-a43f-333dead963bc-cni-binary-copy\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d260be2e-0541-4595-95d1-cf52b077b22b-host\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096879 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-tmp-dir\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096911 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-cnibin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096937 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-netns\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096969 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-run-netns\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096968 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-multus\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.096999 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-host-var-lib-cni-multus\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097000 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-hostroot\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097026 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-hostroot\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097030 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovnkube-config\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097037 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-netns\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097069 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-cnibin\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097102 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-host-run-netns\") pod \"ovnkube-node-fwm7d\" 
(UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097149 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q78s\" (UniqueName: \"kubernetes.io/projected/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-kube-api-access-5q78s\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097187 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097192 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-tmp-dir\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:30.098509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097285 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/bff6b952-968b-4fe3-a43f-333dead963bc-multus-socket-dir-parent\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.099181 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097302 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.099181 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097341 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c068e6b8-2d0c-45f9-a80f-87d043b56b89-run-openvswitch\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.099181 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097344 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-host\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.099181 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.097639 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-cni-binary-copy\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.099416 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.099395 2562 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c068e6b8-2d0c-45f9-a80f-87d043b56b89-ovn-node-metrics-cert\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.099546 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.099520 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7073fe4e-234e-40a7-b337-e387d20bd403-agent-certs\") pod \"konnectivity-agent-8nkjl\" (UID: \"7073fe4e-234e-40a7-b337-e387d20bd403\") " pod="kube-system/konnectivity-agent-8nkjl" Apr 16 22:13:30.103761 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.103360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smvrw\" (UniqueName: \"kubernetes.io/projected/bff6b952-968b-4fe3-a43f-333dead963bc-kube-api-access-smvrw\") pod \"multus-qcw82\" (UID: \"bff6b952-968b-4fe3-a43f-333dead963bc\") " pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.103761 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.103527 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwzs\" (UniqueName: \"kubernetes.io/projected/c068e6b8-2d0c-45f9-a80f-87d043b56b89-kube-api-access-qpwzs\") pod \"ovnkube-node-fwm7d\" (UID: \"c068e6b8-2d0c-45f9-a80f-87d043b56b89\") " pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.103761 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.103697 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgfb\" (UniqueName: \"kubernetes.io/projected/8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1-kube-api-access-tmgfb\") pod \"iptables-alerter-zshz5\" (UID: \"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1\") " pod="openshift-network-operator/iptables-alerter-zshz5" Apr 16 22:13:30.104009 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.103864 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-pb2kr\" (UniqueName: \"kubernetes.io/projected/9775f627-c910-402f-bb3d-4d424e2d7968-kube-api-access-pb2kr\") pod \"aws-ebs-csi-driver-node-cg64r\" (UID: \"9775f627-c910-402f-bb3d-4d424e2d7968\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.104105 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.104086 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krnhw\" (UniqueName: \"kubernetes.io/projected/c19bbb1b-93c9-40fc-9dc8-bc5463213a6d-kube-api-access-krnhw\") pod \"node-resolver-l2b5r\" (UID: \"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d\") " pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:30.104403 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.104379 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:30.104403 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.104406 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:30.104535 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.104418 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:30.104535 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.104523 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:30.604495816 +0000 UTC m=+3.087014932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:30.105795 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.105772 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtmf\" (UniqueName: \"kubernetes.io/projected/d260be2e-0541-4595-95d1-cf52b077b22b-kube-api-access-xrtmf\") pod \"node-ca-4jzbz\" (UID: \"d260be2e-0541-4595-95d1-cf52b077b22b\") " pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.110430 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.110402 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q78s\" (UniqueName: \"kubernetes.io/projected/19b43ea6-fab0-42f9-83dc-7b9ced78d6fa-kube-api-access-5q78s\") pod \"multus-additional-cni-plugins-6f9g5\" (UID: \"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa\") " pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.198625 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198572 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-modprobe-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198784 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198644 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198784 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198692 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-var-lib-kubelet\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198784 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198737 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:30.198784 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198770 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-host\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198965 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198802 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-sys\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198965 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198831 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-lib-modules\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198965 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198892 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-tuned\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.198965 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198935 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysconfig\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199132 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.198971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hktlt\" (UniqueName: \"kubernetes.io/projected/6d656b01-098b-4cc0-81bb-66f2e3de7643-kube-api-access-hktlt\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199132 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199009 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-conf\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199132 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199041 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-tmp\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199132 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199099 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4fd\" (UniqueName: \"kubernetes.io/projected/ef0b8b85-4299-4164-b2f4-ae06377db331-kube-api-access-ch4fd\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:30.199373 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-run\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199373 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199171 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-kubernetes\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199373 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199195 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-systemd\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199373 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199305 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-systemd\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199546 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199459 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-modprobe-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199591 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-d\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199672 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.199656 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-var-lib-kubelet\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.199809 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.199794 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:30.199918 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.199907 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:30.699886231 +0000 UTC m=+3.182405362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:30.200246 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200225 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-host\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200299 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-sys\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200436 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200421 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-lib-modules\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200894 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200855 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-run\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200872 
2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-kubernetes\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200940 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysctl-conf\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.200984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.200970 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-sysconfig\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.203176 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.203151 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-tmp\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.203770 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.203741 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d656b01-098b-4cc0-81bb-66f2e3de7643-etc-tuned\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.211511 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.211448 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ch4fd\" (UniqueName: \"kubernetes.io/projected/ef0b8b85-4299-4164-b2f4-ae06377db331-kube-api-access-ch4fd\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:30.211511 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.211448 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hktlt\" (UniqueName: \"kubernetes.io/projected/6d656b01-098b-4cc0-81bb-66f2e3de7643-kube-api-access-hktlt\") pod \"tuned-c7wd4\" (UID: \"6d656b01-098b-4cc0-81bb-66f2e3de7643\") " pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.284363 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.284322 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" Apr 16 22:13:30.292337 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.292312 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zshz5" Apr 16 22:13:30.300121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.300083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8nkjl" Apr 16 22:13:30.305879 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.305854 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" Apr 16 22:13:30.312362 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.312309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4jzbz" Apr 16 22:13:30.320078 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.320049 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-l2b5r" Apr 16 22:13:30.327676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.327653 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" Apr 16 22:13:30.335326 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.335296 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qcw82" Apr 16 22:13:30.340982 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.340947 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" Apr 16 22:13:30.451555 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.451518 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9b5sq"] Apr 16 22:13:30.453715 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.453690 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.453830 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.453777 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:30.501976 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.501872 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.501976 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.501948 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-kubelet-config\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.502208 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.501998 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-dbus\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.602378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.602326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.602565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.602409 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-kubelet-config\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.602565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.602461 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-dbus\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.602565 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.602502 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:30.602565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.602536 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-kubelet-config\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.602790 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.602572 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.102552937 +0000 UTC m=+3.585072051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:30.602790 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.602647 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/419df959-4512-4006-ba6a-cca963743f66-dbus\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:30.684224 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.684196 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9775f627_c910_402f_bb3d_4d424e2d7968.slice/crio-017eb07c96d4e6a36c1f61feecd22ddf9311f72bec4b705e8d0268b3185ef83f WatchSource:0}: Error finding container 017eb07c96d4e6a36c1f61feecd22ddf9311f72bec4b705e8d0268b3185ef83f: Status 404 returned error can't find the container with id 017eb07c96d4e6a36c1f61feecd22ddf9311f72bec4b705e8d0268b3185ef83f Apr 16 22:13:30.685516 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.685477 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6db1b0_a7c2_4a16_b770_6f40d9c71bf1.slice/crio-48f5d5fb3cf12c1580e9379905cdec2227e94db68c24e7a0f32fa929cfac997b WatchSource:0}: Error finding container 48f5d5fb3cf12c1580e9379905cdec2227e94db68c24e7a0f32fa929cfac997b: Status 404 returned error can't find the container with id 48f5d5fb3cf12c1580e9379905cdec2227e94db68c24e7a0f32fa929cfac997b Apr 16 22:13:30.688761 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.688737 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff6b952_968b_4fe3_a43f_333dead963bc.slice/crio-b4c4e5e4033e26ace4bfd6168a2b2437db879f3686524d8a939db4d2ff9fb3b8 WatchSource:0}: Error finding container b4c4e5e4033e26ace4bfd6168a2b2437db879f3686524d8a939db4d2ff9fb3b8: Status 404 returned error can't find the container with id b4c4e5e4033e26ace4bfd6168a2b2437db879f3686524d8a939db4d2ff9fb3b8
Apr 16 22:13:30.689650 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.689624 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc068e6b8_2d0c_45f9_a80f_87d043b56b89.slice/crio-ad600f034022ef8c1004594d8b1af40f2fae35b9a304224abe306efc1e41e03b WatchSource:0}: Error finding container ad600f034022ef8c1004594d8b1af40f2fae35b9a304224abe306efc1e41e03b: Status 404 returned error can't find the container with id ad600f034022ef8c1004594d8b1af40f2fae35b9a304224abe306efc1e41e03b
Apr 16 22:13:30.690373 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.690349 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7073fe4e_234e_40a7_b337_e387d20bd403.slice/crio-8b517e8333ce7ec2831fef58f4c45f422dca3b75e6893baaad7e3b08907b20be WatchSource:0}: Error finding container 8b517e8333ce7ec2831fef58f4c45f422dca3b75e6893baaad7e3b08907b20be: Status 404 returned error can't find the container with id 8b517e8333ce7ec2831fef58f4c45f422dca3b75e6893baaad7e3b08907b20be
Apr 16 22:13:30.691242 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.691218 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19b43ea6_fab0_42f9_83dc_7b9ced78d6fa.slice/crio-ab6c5bc847b432e464e046e4380665f1f3d43e5839cfb7907485f0c9415d9189 WatchSource:0}: Error finding container ab6c5bc847b432e464e046e4380665f1f3d43e5839cfb7907485f0c9415d9189: Status 404 returned error can't find the container with id ab6c5bc847b432e464e046e4380665f1f3d43e5839cfb7907485f0c9415d9189
Apr 16 22:13:30.692968 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.692925 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd260be2e_0541_4595_95d1_cf52b077b22b.slice/crio-4ec3a0ca0dce482f762d0cbc8ca1946155acf2bdf69cf00296121a2d90f3cec5 WatchSource:0}: Error finding container 4ec3a0ca0dce482f762d0cbc8ca1946155acf2bdf69cf00296121a2d90f3cec5: Status 404 returned error can't find the container with id 4ec3a0ca0dce482f762d0cbc8ca1946155acf2bdf69cf00296121a2d90f3cec5
Apr 16 22:13:30.694468 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.694132 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19bbb1b_93c9_40fc_9dc8_bc5463213a6d.slice/crio-6eb4b5cb6072d2bf6d164f4d2fe5de91650c89823d24d9389e0a8120440a7357 WatchSource:0}: Error finding container 6eb4b5cb6072d2bf6d164f4d2fe5de91650c89823d24d9389e0a8120440a7357: Status 404 returned error can't find the container with id 6eb4b5cb6072d2bf6d164f4d2fe5de91650c89823d24d9389e0a8120440a7357
Apr 16 22:13:30.694796 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:13:30.694704 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d656b01_098b_4cc0_81bb_66f2e3de7643.slice/crio-43d0a459cde5d3cef06e95b9f9dc14384b78d6f61a1dfe6a217f9e9af6f8ca10 WatchSource:0}: Error finding container 43d0a459cde5d3cef06e95b9f9dc14384b78d6f61a1dfe6a217f9e9af6f8ca10: Status 404 returned error can't find the container with id 43d0a459cde5d3cef06e95b9f9dc14384b78d6f61a1dfe6a217f9e9af6f8ca10
Apr 16 22:13:30.703086 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.703054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:30.703210 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:30.703118 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:30.703275 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703232 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:30.703326 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703294 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.703274686 +0000 UTC m=+4.185793819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:30.703400 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703384 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:30.703450 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703408 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:30.703450 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703421 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:30.703555 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:30.703465 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.703449427 +0000 UTC m=+4.185968559 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:31.023672 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.023204 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:29 +0000 UTC" deadline="2027-11-11 10:22:40.761429007 +0000 UTC"
Apr 16 22:13:31.023672 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.023579 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13764h9m9.737858863s"
Apr 16 22:13:31.107565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.107136 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:31.107565 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.107283 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:31.107565 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.107345 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:32.107326534 +0000 UTC m=+4.589845655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:31.126053 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.126011 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:31.126214 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.126157 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:31.147448 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.146430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8nkjl" event={"ID":"7073fe4e-234e-40a7-b337-e387d20bd403","Type":"ContainerStarted","Data":"8b517e8333ce7ec2831fef58f4c45f422dca3b75e6893baaad7e3b08907b20be"}
Apr 16 22:13:31.150935 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.150890 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"ad600f034022ef8c1004594d8b1af40f2fae35b9a304224abe306efc1e41e03b"}
Apr 16 22:13:31.156328 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.156060 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" event={"ID":"9775f627-c910-402f-bb3d-4d424e2d7968","Type":"ContainerStarted","Data":"017eb07c96d4e6a36c1f61feecd22ddf9311f72bec4b705e8d0268b3185ef83f"}
Apr 16 22:13:31.164755 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.164716 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" event={"ID":"477d8d5a68adc15182b0ab0c3cde7f73","Type":"ContainerStarted","Data":"c9bac4f734e337695a4bbbb8206f9522b90a67c25d1dfd34dc476046a3a520ee"}
Apr 16 22:13:31.179086 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.179019 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-106.ec2.internal" podStartSLOduration=2.178999309 podStartE2EDuration="2.178999309s" podCreationTimestamp="2026-04-16 22:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:31.178345825 +0000 UTC m=+3.660864962" watchObservedRunningTime="2026-04-16 22:13:31.178999309 +0000 UTC m=+3.661518445"
Apr 16 22:13:31.183329 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.183289 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" event={"ID":"6d656b01-098b-4cc0-81bb-66f2e3de7643","Type":"ContainerStarted","Data":"43d0a459cde5d3cef06e95b9f9dc14384b78d6f61a1dfe6a217f9e9af6f8ca10"}
Apr 16 22:13:31.189475 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.189438 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l2b5r" event={"ID":"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d","Type":"ContainerStarted","Data":"6eb4b5cb6072d2bf6d164f4d2fe5de91650c89823d24d9389e0a8120440a7357"}
Apr 16 22:13:31.194395 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.194358 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qcw82" event={"ID":"bff6b952-968b-4fe3-a43f-333dead963bc","Type":"ContainerStarted","Data":"b4c4e5e4033e26ace4bfd6168a2b2437db879f3686524d8a939db4d2ff9fb3b8"}
Apr 16 22:13:31.197533 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.197497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zshz5" event={"ID":"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1","Type":"ContainerStarted","Data":"48f5d5fb3cf12c1580e9379905cdec2227e94db68c24e7a0f32fa929cfac997b"}
Apr 16 22:13:31.207186 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.207069 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jzbz" event={"ID":"d260be2e-0541-4595-95d1-cf52b077b22b","Type":"ContainerStarted","Data":"4ec3a0ca0dce482f762d0cbc8ca1946155acf2bdf69cf00296121a2d90f3cec5"}
Apr 16 22:13:31.215263 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.215221 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerStarted","Data":"ab6c5bc847b432e464e046e4380665f1f3d43e5839cfb7907485f0c9415d9189"}
Apr 16 22:13:31.713538 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.713495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:31.713639 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713725 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713771 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713789 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713802 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:31.713804 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713808 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.713785823 +0000 UTC m=+6.196304941 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.714140 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:31.713851 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:33.713835466 +0000 UTC m=+6.196354582 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:32.117917 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:32.117871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:32.118408 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:32.118044 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:32.118408 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:32.118109 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:34.11809032 +0000 UTC m=+6.600609439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:32.128352 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:32.128318 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:32.128523 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:32.128455 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:32.128936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:32.128916 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:32.129046 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:32.129017 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:32.232831 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:32.232535 2562 generic.go:358] "Generic (PLEG): container finished" podID="8d71cd7329eab1e79f952bebf6a7f77b" containerID="7906fb8bd2f9cd36caa8d71a96a424cb995dccbea7d3cf6f5612c0d2f7cf2935" exitCode=0
Apr 16 22:13:32.233413 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:32.233187 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerDied","Data":"7906fb8bd2f9cd36caa8d71a96a424cb995dccbea7d3cf6f5612c0d2f7cf2935"}
Apr 16 22:13:33.126989 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:33.126415 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:33.126989 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.126568 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:33.241637 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:33.241584 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" event={"ID":"8d71cd7329eab1e79f952bebf6a7f77b","Type":"ContainerStarted","Data":"b30284a39d196db037bad58e1e9257289a5fce4d70060e9816b68e54134695d6"}
Apr 16 22:13:33.731256 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:33.731205 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:33.731432 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:33.731291 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:33.731494 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731447 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:33.731494 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731465 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:33.731494 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731480 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:33.731685 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731544 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:37.731524627 +0000 UTC m=+10.214043746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:33.731685 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731634 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:33.731685 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:33.731675 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:37.731663978 +0000 UTC m=+10.214183093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:34.126268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:34.126189 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:34.126268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:34.126217 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:34.126478 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:34.126325 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:34.126478 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:34.126385 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:34.134881 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:34.134854 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:34.135286 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:34.135029 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:34.135286 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:34.135098 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:38.135076983 +0000 UTC m=+10.617596113 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:35.126308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:35.126270 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:35.126483 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:35.126411 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:36.126779 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:36.126298 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:36.126779 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:36.126411 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:36.126779 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:36.126417 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:36.126779 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:36.126484 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:37.125838 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:37.125803 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:37.126018 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.125939 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:37.765538 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:37.765615 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.765745 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.765803 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.765784558 +0000 UTC m=+18.248303676 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.766183 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.766202 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.766215 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:37.766291 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:37.766258 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:45.766244743 +0000 UTC m=+18.248763862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:38.127947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:38.127433 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:38.127947 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:38.127552 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:38.127947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:38.127633 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:38.127947 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:38.127736 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:38.169170 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:38.169141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:38.169337 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:38.169275 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:38.169337 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:38.169327 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.169311586 +0000 UTC m=+18.651830713 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:39.126463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:39.126400 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:39.126925 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:39.126543 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:40.126430 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:40.126347 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:40.126588 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:40.126351 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:40.126588 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:40.126487 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:40.127016 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:40.126584 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:41.126526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:41.126483 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:41.126747 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:41.126621 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:42.126360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:42.126323 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:42.126552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:42.126323 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:42.126552 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:42.126452 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:42.126651 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:42.126593 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:43.126393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:43.126361 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:43.126835 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:43.126494 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331" Apr 16 22:13:44.125881 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:44.125843 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:44.126046 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:44.125963 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:44.126046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:44.126034 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:44.126140 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:44.126123 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:45.125622 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:45.125578 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:45.126131 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.125728 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331" Apr 16 22:13:45.820006 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:45.819967 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:45.820193 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:45.820054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:45.820193 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820129 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:45.820193 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820154 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:45.820193 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820169 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:45.820193 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820182 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:45.820426 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820209 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.820189052 +0000 UTC m=+34.302708179 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:45.820426 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:45.820225 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.82021765 +0000 UTC m=+34.302736774 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:46.126465 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:46.126385 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:46.126465 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:46.126438 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:46.126995 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:46.126526 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:46.126995 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:46.126652 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:46.223366 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:46.223310 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:46.223553 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:46.223444 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:46.223553 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:46.223505 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret podName:419df959-4512-4006-ba6a-cca963743f66 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:14:02.223490043 +0000 UTC m=+34.706009177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret") pod "global-pull-secret-syncer-9b5sq" (UID: "419df959-4512-4006-ba6a-cca963743f66") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:47.125752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:47.125710 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:47.125931 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:47.125839 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331" Apr 16 22:13:48.126420 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.126178 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:48.127302 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.126214 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:48.127302 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:48.126819 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:48.127452 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:48.127414 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:48.269025 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.267909 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" event={"ID":"6d656b01-098b-4cc0-81bb-66f2e3de7643","Type":"ContainerStarted","Data":"5bb95a952d59b7867818740a643b8cd93257a71765415a1b6ef1c813af639707"} Apr 16 22:13:48.270784 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.270748 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l2b5r" event={"ID":"c19bbb1b-93c9-40fc-9dc8-bc5463213a6d","Type":"ContainerStarted","Data":"e1e88d907f3b3a236c64d6cd2e08887b19a04dd2c6d4a2579c163aaba21cc51a"} Apr 16 22:13:48.272063 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.272036 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qcw82" event={"ID":"bff6b952-968b-4fe3-a43f-333dead963bc","Type":"ContainerStarted","Data":"317c4a8f696ba7cf024e1a7c58198f9f56cbda721f32387a757c3bf5fb4ba234"} Apr 16 22:13:48.273347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.273325 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jzbz" event={"ID":"d260be2e-0541-4595-95d1-cf52b077b22b","Type":"ContainerStarted","Data":"dd9a06af03c8e960bbbf543a16b8dba80a71d2ee75cffea11a8fa3300aef381c"} Apr 16 22:13:48.274466 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:13:48.274439 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="9a5d2fb78618e4a1a75fc979db06fd731f9a5d26760c32aff1ff5d2a04782b69" exitCode=0 Apr 16 22:13:48.274559 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.274510 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"9a5d2fb78618e4a1a75fc979db06fd731f9a5d26760c32aff1ff5d2a04782b69"} Apr 16 22:13:48.275800 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.275774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8nkjl" event={"ID":"7073fe4e-234e-40a7-b337-e387d20bd403","Type":"ContainerStarted","Data":"1744037dd6dcc0613aa086a173bb4e7c7cab459c3529191477c4b40ebf9ce590"} Apr 16 22:13:48.278329 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278310 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"c237c4565d54737073b44ac32de42131b45997accf6fe063aadf3f65a48a20d3"} Apr 16 22:13:48.278414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278336 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"312307ed357a8fcb917bc9f28080dd04b794b4824b1c82405b42c3923439d3d8"} Apr 16 22:13:48.278414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278361 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"2e99852a037cea3d7f566553257baf3e8b5a695315ef759fb07aca13fd45e555"} Apr 16 22:13:48.278414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278371 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"1e0f1a93b06141c24f971738a58499e60f9b1b46d56693afe242c275cd90262c"} Apr 16 22:13:48.278414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278379 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"6ea1e80e6856adbf2b5c3eac2610a1038be30fe698604ac629c8809a77bdf3e3"} Apr 16 22:13:48.278414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.278386 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"fc5f511c71816244903ab6ac21b4ad07b965f5c133a048cc9cc6c5991427c2ab"} Apr 16 22:13:48.279432 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.279414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" event={"ID":"9775f627-c910-402f-bb3d-4d424e2d7968","Type":"ContainerStarted","Data":"ee49e2b77c381bdd89471c671011eb3439e4854e43894c86895a3a520704c568"} Apr 16 22:13:48.287335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.287284 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-106.ec2.internal" podStartSLOduration=19.287268475 podStartE2EDuration="19.287268475s" podCreationTimestamp="2026-04-16 22:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:33.256328786 +0000 UTC m=+5.738847923" watchObservedRunningTime="2026-04-16 22:13:48.287268475 +0000 UTC m=+20.769787611" Apr 16 22:13:48.300681 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.300633 
2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-c7wd4" podStartSLOduration=3.608244148 podStartE2EDuration="20.300599974s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.696396757 +0000 UTC m=+3.178915886" lastFinishedPulling="2026-04-16 22:13:47.388752581 +0000 UTC m=+19.871271712" observedRunningTime="2026-04-16 22:13:48.287565907 +0000 UTC m=+20.770085045" watchObservedRunningTime="2026-04-16 22:13:48.300599974 +0000 UTC m=+20.783119112" Apr 16 22:13:48.301192 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.301161 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8nkjl" podStartSLOduration=3.609480853 podStartE2EDuration="20.301152213s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.692288351 +0000 UTC m=+3.174807463" lastFinishedPulling="2026-04-16 22:13:47.383959711 +0000 UTC m=+19.866478823" observedRunningTime="2026-04-16 22:13:48.300328776 +0000 UTC m=+20.782847925" watchObservedRunningTime="2026-04-16 22:13:48.301152213 +0000 UTC m=+20.783671352" Apr 16 22:13:48.348594 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.348538 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l2b5r" podStartSLOduration=3.660257811 podStartE2EDuration="20.348524235s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.695683645 +0000 UTC m=+3.178202758" lastFinishedPulling="2026-04-16 22:13:47.383950061 +0000 UTC m=+19.866469182" observedRunningTime="2026-04-16 22:13:48.348162625 +0000 UTC m=+20.830681760" watchObservedRunningTime="2026-04-16 22:13:48.348524235 +0000 UTC m=+20.831043371" Apr 16 22:13:48.348930 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.348898 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-4jzbz" podStartSLOduration=3.659244868 podStartE2EDuration="20.348891015s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.69434075 +0000 UTC m=+3.176859868" lastFinishedPulling="2026-04-16 22:13:47.383986888 +0000 UTC m=+19.866506015" observedRunningTime="2026-04-16 22:13:48.331790406 +0000 UTC m=+20.814309553" watchObservedRunningTime="2026-04-16 22:13:48.348891015 +0000 UTC m=+20.831410150" Apr 16 22:13:48.393295 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.393243 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qcw82" podStartSLOduration=3.581843939 podStartE2EDuration="20.393225211s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.690634922 +0000 UTC m=+3.173154053" lastFinishedPulling="2026-04-16 22:13:47.502016195 +0000 UTC m=+19.984535325" observedRunningTime="2026-04-16 22:13:48.392740834 +0000 UTC m=+20.875259980" watchObservedRunningTime="2026-04-16 22:13:48.393225211 +0000 UTC m=+20.875744347" Apr 16 22:13:48.567119 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:48.567099 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:13:49.055953 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.055842 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:48.567116422Z","UUID":"086e1592-4ecf-4aa5-b819-b8231f8e88f4","Handler":null,"Name":"","Endpoint":""} Apr 16 22:13:49.057725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.057700 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:13:49.057725 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.057732 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:13:49.126282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.126252 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:49.126429 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:49.126367 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331" Apr 16 22:13:49.282993 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.282958 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" event={"ID":"9775f627-c910-402f-bb3d-4d424e2d7968","Type":"ContainerStarted","Data":"8211bd999fb19f0414cf2a9139eb2412649fb64ee3ff92c92d6ed175af6a6917"} Apr 16 22:13:49.285397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.285094 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zshz5" event={"ID":"8a6db1b0-a7c2-4a16-b770-6f40d9c71bf1","Type":"ContainerStarted","Data":"2232fbf7740b42763ec397d5079a33dbfe5dcdb6d48ca473451bac088b24bb4a"} Apr 16 22:13:49.298219 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:49.298168 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zshz5" podStartSLOduration=4.601711927 podStartE2EDuration="21.298155688s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.68750249 +0000 UTC 
m=+3.170021608" lastFinishedPulling="2026-04-16 22:13:47.383946255 +0000 UTC m=+19.866465369" observedRunningTime="2026-04-16 22:13:49.297677856 +0000 UTC m=+21.780196993" watchObservedRunningTime="2026-04-16 22:13:49.298155688 +0000 UTC m=+21.780674825" Apr 16 22:13:50.126175 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:50.126019 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:13:50.126175 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:50.126056 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:13:50.126175 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:50.126141 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3" Apr 16 22:13:50.126460 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:50.126294 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66" Apr 16 22:13:50.290214 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:50.290127 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"e03a9213c2d97e77b6c44549413a6289e05cd4b721a7fd2cdc3d9ae3d0dda4a9"} Apr 16 22:13:50.292114 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:50.292080 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" event={"ID":"9775f627-c910-402f-bb3d-4d424e2d7968","Type":"ContainerStarted","Data":"fbbbaef4b4203e5ee986181b9b17c418e5fdf5dcdc9eab87fa110446d8073cf5"} Apr 16 22:13:51.126246 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:51.126200 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:13:51.126463 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:51.126359 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:51.732622 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:51.732373 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:51.733061 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:51.733036 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:51.748731 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:51.748662 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cg64r" podStartSLOduration=5.173915234 podStartE2EDuration="23.748646269s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.687085243 +0000 UTC m=+3.169604357" lastFinishedPulling="2026-04-16 22:13:49.261816275 +0000 UTC m=+21.744335392" observedRunningTime="2026-04-16 22:13:50.325515368 +0000 UTC m=+22.808034502" watchObservedRunningTime="2026-04-16 22:13:51.748646269 +0000 UTC m=+24.231165404"
Apr 16 22:13:52.126247 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:52.126206 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:52.126247 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:52.126245 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:52.126457 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:52.126335 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:52.126519 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:52.126479 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:52.299072 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:52.298909 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:52.299345 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:52.299331 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8nkjl"
Apr 16 22:13:53.126035 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.126001 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:53.126171 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:53.126104 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:53.298980 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.298902 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="254d7a7ebaec95af590e493c517a5f63115cdbf85546e0df7817014a96eef11b" exitCode=0
Apr 16 22:13:53.299811 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.298983 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"254d7a7ebaec95af590e493c517a5f63115cdbf85546e0df7817014a96eef11b"}
Apr 16 22:13:53.302368 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.302179 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" event={"ID":"c068e6b8-2d0c-45f9-a80f-87d043b56b89","Type":"ContainerStarted","Data":"cb672649dae7c86ed645dedddaa4b6467de95f8bd9eb0975abe9e66dd2f13f0f"}
Apr 16 22:13:53.302483 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.302465 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:53.302541 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.302489 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:53.316874 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.316853 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:53.316945 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.316915 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:53.349309 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:53.349269 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d" podStartSLOduration=8.476410249 podStartE2EDuration="25.349258509s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.691844509 +0000 UTC m=+3.174363623" lastFinishedPulling="2026-04-16 22:13:47.56469277 +0000 UTC m=+20.047211883" observedRunningTime="2026-04-16 22:13:53.348824885 +0000 UTC m=+25.831344020" watchObservedRunningTime="2026-04-16 22:13:53.349258509 +0000 UTC m=+25.831777643"
Apr 16 22:13:54.125791 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.125757 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:54.125965 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.125765 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:54.125965 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:54.125885 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:54.125965 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:54.125944 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:54.304207 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.304179 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:13:54.305085 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.305049 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4zqvj"]
Apr 16 22:13:54.305215 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.305141 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:54.305275 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:54.305244 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:54.308094 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.308071 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-krggd"]
Apr 16 22:13:54.308209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.308160 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:54.308260 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:54.308230 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:54.308760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.308743 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9b5sq"]
Apr 16 22:13:54.308835 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:54.308823 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:54.308930 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:54.308911 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:55.307935 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:55.307904 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="a60b9bf0e0f6bdb6fa97e296a304a8bb2af69dfda80f50c1f657e5958001c431" exitCode=0
Apr 16 22:13:55.308334 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:55.307985 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"a60b9bf0e0f6bdb6fa97e296a304a8bb2af69dfda80f50c1f657e5958001c431"}
Apr 16 22:13:55.308334 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:55.308212 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 22:13:55.913922 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:55.913887 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:13:56.126028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:56.125991 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:56.126236 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:56.125996 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:56.126236 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:56.126089 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:13:56.126236 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:56.126186 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:56.126236 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:56.126004 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:56.126409 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:56.126289 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:57.312521 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:57.312487 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="b8b5742ce7bb7b0b4ae5b20c172aeccfca5b92ba4ecb44edbfa81868c245de75" exitCode=0
Apr 16 22:13:57.313029 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:57.312530 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"b8b5742ce7bb7b0b4ae5b20c172aeccfca5b92ba4ecb44edbfa81868c245de75"}
Apr 16 22:13:58.127449 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:58.127272 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:13:58.127655 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:58.127518 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:13:58.127655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:58.127354 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:13:58.127655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:13:58.127384 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:13:58.127655 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:58.127637 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:13:58.127875 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:13:58.127684 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:14:00.125720 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.125690 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:14:00.126154 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.125686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:14:00.126154 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.125832 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:14:00.126154 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.125878 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-krggd" podUID="d78565aa-9f67-4043-a21a-fe0e9c37b4c3"
Apr 16 22:14:00.126154 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.125686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq"
Apr 16 22:14:00.126154 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.125981 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9b5sq" podUID="419df959-4512-4006-ba6a-cca963743f66"
Apr 16 22:14:00.273427 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.273399 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-106.ec2.internal" event="NodeReady"
Apr 16 22:14:00.273630 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.273554 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:00.311761 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.311680 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"]
Apr 16 22:14:00.351498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.351454 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"]
Apr 16 22:14:00.351498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.351488 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zprfg"]
Apr 16 22:14:00.352579 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.351871 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.354464 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.354439 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s46c2\""
Apr 16 22:14:00.354968 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.354949 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:14:00.359054 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.358891 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:14:00.359054 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.358967 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:14:00.365839 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.365821 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zprfg"]
Apr 16 22:14:00.365945 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.365931 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.368635 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.368598 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\""
Apr 16 22:14:00.369148 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.369126 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:00.369421 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.369406 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:00.384537 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.384518 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:14:00.432411 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.432388 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xg95t"]
Apr 16 22:14:00.434867 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.434847 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.434980 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.434886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.434980 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.434920 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4f7\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.435098 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.434997 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.435098 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.435039 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.435098 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.435085 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.435211 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.435114 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.435211 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.435168 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.447206 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.447130 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xg95t"]
Apr 16 22:14:00.447313 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.447226 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:00.450125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.450068 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:00.450463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.450254 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\""
Apr 16 22:14:00.450463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.450294 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:00.450463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.450384 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:00.536041 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536005 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-tmp-dir\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.536202 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536077 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.536202 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536128 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536217 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7d6g\" (UniqueName: \"kubernetes.io/projected/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-kube-api-access-t7d6g\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.536301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ssz\" (UniqueName: \"kubernetes.io/projected/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-kube-api-access-c8ssz\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:00.536383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536327 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536392 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536447 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4f7\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536516 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536541 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536581 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-config-volume\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.536655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536625 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536648 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:00.536771 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.536679 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.536994 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.536972 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:14:00.537099 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.536997 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found
Apr 16 22:14:00.537099 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.537066 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.03704172 +0000 UTC m=+33.519560847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found
Apr 16 22:14:00.537361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.537334 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.538297 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.538277 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.540808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.540785 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.540872 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.540814 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.545798 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.545780 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4f7\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.546140 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.546126 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:00.637760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637690 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-tmp-dir\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.637760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637756 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637782 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7d6g\" (UniqueName: \"kubernetes.io/projected/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-kube-api-access-t7d6g\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637803 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8ssz\" (UniqueName: \"kubernetes.io/projected/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-kube-api-access-c8ssz\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637869 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-config-volume\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637883 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-tmp-dir\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.637895 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:00.637947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.637891 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:00.638169 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.637954 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:00.638169 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.637962 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.13794128 +0000 UTC m=+33.620460392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:00.638169 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:00.637999 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:01.137984452 +0000 UTC m=+33.620503568 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found Apr 16 22:14:00.640765 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.638758 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-config-volume\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg" Apr 16 22:14:00.647942 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.647922 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ssz\" (UniqueName: \"kubernetes.io/projected/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-kube-api-access-c8ssz\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t" Apr 16 22:14:00.648208 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:00.648192 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7d6g\" (UniqueName: \"kubernetes.io/projected/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-kube-api-access-t7d6g\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg" Apr 16 22:14:01.041017 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:01.040977 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:14:01.041211 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.041128 2562 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:01.041211 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.041141 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found Apr 16 22:14:01.041211 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.041194 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.0411786 +0000 UTC m=+34.523697725 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found Apr 16 22:14:01.142531 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:01.142489 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t" Apr 16 22:14:01.143329 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:01.142555 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg" Apr 16 22:14:01.143329 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.142666 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 
22:14:01.143329 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.142747 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.142731291 +0000 UTC m=+34.625250408 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found Apr 16 22:14:01.143329 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.142676 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:01.143329 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.142853 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.142836253 +0000 UTC m=+34.625355366 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found Apr 16 22:14:01.848755 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:01.848705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:01.848809 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.848882 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.848904 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.848918 2562 projected.go:194] Error preparing data for projected volume kube-api-access-fq2w2 for pod openshift-network-diagnostics/network-check-target-krggd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.848943 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:01.848992 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.848975 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2 podName:d78565aa-9f67-4043-a21a-fe0e9c37b4c3 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.848959052 +0000 UTC m=+66.331478196 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fq2w2" (UniqueName: "kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2") pod "network-check-target-krggd" (UID: "d78565aa-9f67-4043-a21a-fe0e9c37b4c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:14:01.849239 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:01.849022 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.849009742 +0000 UTC m=+66.331528855 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:14:02.050056 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.050024 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:14:02.050238 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.050194 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:02.050238 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.050216 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found Apr 16 22:14:02.050353 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.050284 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:04.050262918 +0000 UTC m=+36.532782048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found Apr 16 22:14:02.126634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.126512 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:14:02.126634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.126537 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd" Apr 16 22:14:02.126817 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.126655 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:14:02.129330 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.129299 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:14:02.130629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.130552 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:14:02.130629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.130614 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:14:02.130814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.130634 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-28226\"" Apr 16 22:14:02.130814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.130638 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:14:02.130814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.130638 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\"" Apr 16 22:14:02.150501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.150480 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg" Apr 16 22:14:02.150852 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.150549 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t" Apr 16 22:14:02.150852 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.150655 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:02.150852 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.150663 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:02.150852 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.150717 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:04.150703245 +0000 UTC m=+36.633222357 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found Apr 16 22:14:02.150852 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:02.150733 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:04.150724743 +0000 UTC m=+36.633243856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found Apr 16 22:14:02.251413 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.251383 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:14:02.254041 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.254016 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419df959-4512-4006-ba6a-cca963743f66-original-pull-secret\") pod \"global-pull-secret-syncer-9b5sq\" (UID: \"419df959-4512-4006-ba6a-cca963743f66\") " pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:14:02.439181 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:02.439097 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9b5sq" Apr 16 22:14:03.010615 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:03.010576 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9b5sq"] Apr 16 22:14:03.092416 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:14:03.092386 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419df959_4512_4006_ba6a_cca963743f66.slice/crio-18b2a78e51db85e5b073d6e38f62f3a935496444557ecc2e0b6660dd8f6469da WatchSource:0}: Error finding container 18b2a78e51db85e5b073d6e38f62f3a935496444557ecc2e0b6660dd8f6469da: Status 404 returned error can't find the container with id 18b2a78e51db85e5b073d6e38f62f3a935496444557ecc2e0b6660dd8f6469da Apr 16 22:14:03.326294 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:03.326063 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9b5sq" event={"ID":"419df959-4512-4006-ba6a-cca963743f66","Type":"ContainerStarted","Data":"18b2a78e51db85e5b073d6e38f62f3a935496444557ecc2e0b6660dd8f6469da"} Apr 16 22:14:03.328599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:03.328571 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerStarted","Data":"7380863c5d273975998c31ca2612f43c87b8169ce8099a99d026a9d4b03c9101"} Apr 16 22:14:04.067469 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:04.067439 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:14:04.067649 ip-10-0-135-106 kubenswrapper[2562]: E0416 
22:14:04.067599 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:04.067649 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.067631 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found Apr 16 22:14:04.067765 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.067687 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:08.067669497 +0000 UTC m=+40.550188632 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found Apr 16 22:14:04.168672 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:04.168598 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t" Apr 16 22:14:04.168672 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:04.168669 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg" Apr 16 22:14:04.168862 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.168712 2562 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:04.168862 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.168782 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:04.168862 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.168794 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:08.168769951 +0000 UTC m=+40.651289089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found Apr 16 22:14:04.168862 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:04.168828 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:08.168815923 +0000 UTC m=+40.651335037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found Apr 16 22:14:04.333519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:04.333487 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="7380863c5d273975998c31ca2612f43c87b8169ce8099a99d026a9d4b03c9101" exitCode=0 Apr 16 22:14:04.333905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:04.333549 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"7380863c5d273975998c31ca2612f43c87b8169ce8099a99d026a9d4b03c9101"} Apr 16 22:14:05.338376 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:05.338340 2562 generic.go:358] "Generic (PLEG): container finished" podID="19b43ea6-fab0-42f9-83dc-7b9ced78d6fa" containerID="97d4ab19efd9603c138b8d9c74b12dbbbd840f63c6be3b513c9fad1d73cf7348" exitCode=0 Apr 16 22:14:05.338937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:05.338398 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerDied","Data":"97d4ab19efd9603c138b8d9c74b12dbbbd840f63c6be3b513c9fad1d73cf7348"} Apr 16 22:14:06.344154 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:06.344117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" event={"ID":"19b43ea6-fab0-42f9-83dc-7b9ced78d6fa","Type":"ContainerStarted","Data":"e73dd99a9d09f84634260d673f45f92a8be0a994e75df884d7bb1d51c8e74b84"} Apr 16 22:14:06.371871 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:06.371812 2562 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-additional-cni-plugins-6f9g5" podStartSLOduration=5.931139775 podStartE2EDuration="38.371794286s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.694355866 +0000 UTC m=+3.176874979" lastFinishedPulling="2026-04-16 22:14:03.135010376 +0000 UTC m=+35.617529490" observedRunningTime="2026-04-16 22:14:06.370186089 +0000 UTC m=+38.852705227" watchObservedRunningTime="2026-04-16 22:14:06.371794286 +0000 UTC m=+38.854313421" Apr 16 22:14:08.106284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:08.106230 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:14:08.106843 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.106474 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:14:08.106843 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.106494 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found Apr 16 22:14:08.106843 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.106577 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.106553521 +0000 UTC m=+48.589072651 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found
Apr 16 22:14:08.207437 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:08.207410 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:08.207590 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:08.207473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:08.207590 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.207552 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:08.207590 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.207553 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:08.207710 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.207596 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.207583891 +0000 UTC m=+48.690103005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found
Apr 16 22:14:08.207710 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:08.207626 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:16.207619455 +0000 UTC m=+48.690138568 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:08.349007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:08.348974 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9b5sq" event={"ID":"419df959-4512-4006-ba6a-cca963743f66","Type":"ContainerStarted","Data":"401848ae6238842b14d0ff8a0808bd3f08844c9c4ea92ba0f0a60de135c0ca77"}
Apr 16 22:14:08.364756 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:08.364670 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9b5sq" podStartSLOduration=34.01386162 podStartE2EDuration="38.364657517s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:14:03.112168999 +0000 UTC m=+35.594688111" lastFinishedPulling="2026-04-16 22:14:07.462964895 +0000 UTC m=+39.945484008" observedRunningTime="2026-04-16 22:14:08.36396088 +0000 UTC m=+40.846480016" watchObservedRunningTime="2026-04-16 22:14:08.364657517 +0000 UTC m=+40.847176652"
Apr 16 22:14:10.988316 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:10.988284 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"]
Apr 16 22:14:11.022525 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.022483 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"]
Apr 16 22:14:11.022717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.022627 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.025050 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025024 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 22:14:11.025203 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025055 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 22:14:11.025203 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025088 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 22:14:11.025315 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025250 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 22:14:11.025905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025882 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 22:14:11.026036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025914 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 22:14:11.026036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.025961 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 22:14:11.027350 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027330 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14c99cff-d122-4d8b-9021-5b7ae6333efa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.027436 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027362 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7284\" (UniqueName: \"kubernetes.io/projected/14c99cff-d122-4d8b-9021-5b7ae6333efa-kube-api-access-n7284\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.027436 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027396 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.027532 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027486 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.027532 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027515 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.027621 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.027587 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128207 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128168 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128409 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128258 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14c99cff-d122-4d8b-9021-5b7ae6333efa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128409 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128299 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7284\" (UniqueName: \"kubernetes.io/projected/14c99cff-d122-4d8b-9021-5b7ae6333efa-kube-api-access-n7284\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128409 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128329 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128494 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.128557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.128537 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.129143 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.129114 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/14c99cff-d122-4d8b-9021-5b7ae6333efa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.132742 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.132715 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-ca\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.132858 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.132837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.132896 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.132853 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.132974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.132956 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/14c99cff-d122-4d8b-9021-5b7ae6333efa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.140695 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.140659 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7284\" (UniqueName: \"kubernetes.io/projected/14c99cff-d122-4d8b-9021-5b7ae6333efa-kube-api-access-n7284\") pod \"cluster-proxy-proxy-agent-6944d7fd56-cm7bl\" (UID: \"14c99cff-d122-4d8b-9021-5b7ae6333efa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.347599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.347560 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:14:11.477862 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:11.477828 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"]
Apr 16 22:14:11.480719 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:14:11.480692 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14c99cff_d122_4d8b_9021_5b7ae6333efa.slice/crio-e81340b7b656c84fc8b50913cb422a8ad996b6ee77dacc323ebde0eac199646f WatchSource:0}: Error finding container e81340b7b656c84fc8b50913cb422a8ad996b6ee77dacc323ebde0eac199646f: Status 404 returned error can't find the container with id e81340b7b656c84fc8b50913cb422a8ad996b6ee77dacc323ebde0eac199646f
Apr 16 22:14:12.358638 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:12.358593 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerStarted","Data":"e81340b7b656c84fc8b50913cb422a8ad996b6ee77dacc323ebde0eac199646f"}
Apr 16 22:14:15.365310 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:15.365250 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerStarted","Data":"29287047e5761de761518dd2a7af7f2a21f3fe09867bef147d125dded0b8e1b8"}
Apr 16 22:14:16.159655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:16.159622 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:16.159814 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.159760 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:14:16.159814 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.159778 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found
Apr 16 22:14:16.159896 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.159827 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:32.15981335 +0000 UTC m=+64.642332464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found
Apr 16 22:14:16.260285 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:16.260251 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:16.260443 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:16.260318 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:16.260443 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.260401 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:16.260443 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.260421 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:16.260629 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.260467 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:14:32.260449079 +0000 UTC m=+64.742968207 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found
Apr 16 22:14:16.260629 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:16.260481 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:14:32.260475109 +0000 UTC m=+64.742994222 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:18.372430 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:18.372400 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerStarted","Data":"c759604321624b9cf86d932f83ca22f774784314620604b61e11899e331aa42e"}
Apr 16 22:14:18.372766 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:18.372438 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerStarted","Data":"e3b0460c053655afc5dfa728fc6cf2eb8a9974dc68b57d10dd2f1c7f5c8e760b"}
Apr 16 22:14:18.391471 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:18.391423 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" podStartSLOduration=1.630680372 podStartE2EDuration="8.391408586s" podCreationTimestamp="2026-04-16 22:14:10 +0000 UTC" firstStartedPulling="2026-04-16 22:14:11.482528302 +0000 UTC m=+43.965047420" lastFinishedPulling="2026-04-16 22:14:18.243256521 +0000 UTC m=+50.725775634" observedRunningTime="2026-04-16 22:14:18.390733218 +0000 UTC m=+50.873252354" watchObservedRunningTime="2026-04-16 22:14:18.391408586 +0000 UTC m=+50.873927721"
Apr 16 22:14:26.320295 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:26.320265 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fwm7d"
Apr 16 22:14:32.175236 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:32.175192 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:14:32.175746 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.175345 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:14:32.175746 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.175366 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found
Apr 16 22:14:32.175746 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.175440 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:04.175423691 +0000 UTC m=+96.657942804 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found
Apr 16 22:14:32.276099 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:32.276065 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:14:32.276205 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:32.276135 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:14:32.276247 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.276215 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:32.276247 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.276215 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:32.276308 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.276265 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:15:04.276252932 +0000 UTC m=+96.758772044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found
Apr 16 22:14:32.276308 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:32.276277 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:15:04.276271392 +0000 UTC m=+96.758790504 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:33.889131 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.889096 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:14:33.889547 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.889155 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:14:33.891618 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.891584 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:33.891717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.891690 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:33.899489 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:33.899470 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:33.899615 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:14:33.899525 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:37.899510173 +0000 UTC m=+130.382029291 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : secret "metrics-daemon-secret" not found
Apr 16 22:14:33.901600 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.901583 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:33.912567 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.912551 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq2w2\" (UniqueName: \"kubernetes.io/projected/d78565aa-9f67-4043-a21a-fe0e9c37b4c3-kube-api-access-fq2w2\") pod \"network-check-target-krggd\" (UID: \"d78565aa-9f67-4043-a21a-fe0e9c37b4c3\") " pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:14:33.952828 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.952805 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-28226\""
Apr 16 22:14:33.961554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:33.961538 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:14:34.078868 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:34.078659 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-krggd"]
Apr 16 22:14:34.082969 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:14:34.082942 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78565aa_9f67_4043_a21a_fe0e9c37b4c3.slice/crio-d033a6266113cc3f1321fe2468fec58d47990df4aa082563622fa8fb99e79ce6 WatchSource:0}: Error finding container d033a6266113cc3f1321fe2468fec58d47990df4aa082563622fa8fb99e79ce6: Status 404 returned error can't find the container with id d033a6266113cc3f1321fe2468fec58d47990df4aa082563622fa8fb99e79ce6
Apr 16 22:14:34.404153 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:34.404120 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-krggd" event={"ID":"d78565aa-9f67-4043-a21a-fe0e9c37b4c3","Type":"ContainerStarted","Data":"d033a6266113cc3f1321fe2468fec58d47990df4aa082563622fa8fb99e79ce6"}
Apr 16 22:14:37.411353 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:37.411317 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-krggd" event={"ID":"d78565aa-9f67-4043-a21a-fe0e9c37b4c3","Type":"ContainerStarted","Data":"a251c4072b704bb4c2d3e58413f70f0e274ee808e5aac0d4c68dbb51ddc2e7d1"}
Apr 16 22:14:37.411779 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:37.411439 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:14:37.426172 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:14:37.426130 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-krggd" podStartSLOduration=66.830388322 podStartE2EDuration="1m9.426119369s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:14:34.085339823 +0000 UTC m=+66.567858942" lastFinishedPulling="2026-04-16 22:14:36.681070862 +0000 UTC m=+69.163589989" observedRunningTime="2026-04-16 22:14:37.424723385 +0000 UTC m=+69.907242520" watchObservedRunningTime="2026-04-16 22:14:37.426119369 +0000 UTC m=+69.908638503"
Apr 16 22:15:04.202708 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:04.202626 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:15:04.203083 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.202783 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:04.203083 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.202800 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b85c8896b-qn2l2: secret "image-registry-tls" not found
Apr 16 22:15:04.203083 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.202863 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls podName:f5d30e65-e56d-4830-9544-0d047de3e6e6 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:08.202847897 +0000 UTC m=+160.685367029 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls") pod "image-registry-6b85c8896b-qn2l2" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6") : secret "image-registry-tls" not found
Apr 16 22:15:04.303081 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:04.303039 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:15:04.303205 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:04.303147 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:15:04.303243 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.303202 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:15:04.303276 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.303261 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:15:04.303308 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.303282 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls podName:896dfd17-7377-42ff-b2c1-0ff2bbb1909a nodeName:}" failed. No retries permitted until 2026-04-16 22:16:08.303264048 +0000 UTC m=+160.785783164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls") pod "dns-default-zprfg" (UID: "896dfd17-7377-42ff-b2c1-0ff2bbb1909a") : secret "dns-default-metrics-tls" not found
Apr 16 22:15:04.303355 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:04.303315 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert podName:262b53c6-89e8-4fcb-9d2d-6de1c03648ad nodeName:}" failed. No retries permitted until 2026-04-16 22:16:08.303298521 +0000 UTC m=+160.785817655 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert") pod "ingress-canary-xg95t" (UID: "262b53c6-89e8-4fcb-9d2d-6de1c03648ad") : secret "canary-serving-cert" not found
Apr 16 22:15:08.415905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:08.415876 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-krggd"
Apr 16 22:15:27.898470 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:27.898439 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l2b5r_c19bbb1b-93c9-40fc-9dc8-bc5463213a6d/dns-node-resolver/0.log"
Apr 16 22:15:28.697984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:28.697958 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4jzbz_d260be2e-0541-4595-95d1-cf52b077b22b/node-ca/0.log"
Apr 16 22:15:37.962615 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:37.962561 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:15:37.963076 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:37.962717 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:15:37.963076 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:15:37.962789 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs podName:ef0b8b85-4299-4164-b2f4-ae06377db331 nodeName:}" failed. No retries permitted until 2026-04-16 22:17:39.962773052 +0000 UTC m=+252.445292170 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs") pod "network-metrics-daemon-4zqvj" (UID: "ef0b8b85-4299-4164-b2f4-ae06377db331") : secret "metrics-daemon-secret" not found
Apr 16 22:15:51.889876 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.889841 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6wx9g"]
Apr 16 22:15:51.892816 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.892800 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6wx9g"
Apr 16 22:15:51.895704 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.895683 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:15:51.895821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.895703 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:15:51.896404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.896383 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:15:51.896550 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.896417 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:15:51.896934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.896909 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bblwd\""
Apr 16 22:15:51.906656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.906631 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6wx9g"]
Apr 16 22:15:51.962590 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.962552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d8e91a5-19e6-40c2-b353-674e0838577c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g"
Apr 16 22:15:51.962801 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.962624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8d8e91a5-19e6-40c2-b353-674e0838577c-crio-socket\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:51.962801 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.962664 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d8e91a5-19e6-40c2-b353-674e0838577c-data-volume\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:51.962801 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.962690 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwvl\" (UniqueName: \"kubernetes.io/projected/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-api-access-tkwvl\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:51.962801 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.962728 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:51.974214 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.974184 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-nc9dz"] Apr 16 22:15:51.976901 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.976885 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:15:51.979789 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.979764 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:15:51.980805 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.980785 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:15:51.980876 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.980841 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-g42df\"" Apr 16 22:15:51.989955 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:51.988782 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-nc9dz"] Apr 16 22:15:52.063998 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.063971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d8e91a5-19e6-40c2-b353-674e0838577c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064126 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064005 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8j2s\" (UniqueName: \"kubernetes.io/projected/92f991e3-505f-4a83-870d-ff28b1ed3fad-kube-api-access-n8j2s\") pod \"downloads-6bcc868b7-nc9dz\" (UID: \"92f991e3-505f-4a83-870d-ff28b1ed3fad\") " pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:15:52.064126 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064040 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/8d8e91a5-19e6-40c2-b353-674e0838577c-crio-socket\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064126 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064094 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d8e91a5-19e6-40c2-b353-674e0838577c-data-volume\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064126 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064116 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8d8e91a5-19e6-40c2-b353-674e0838577c-crio-socket\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064252 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064133 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwvl\" (UniqueName: \"kubernetes.io/projected/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-api-access-tkwvl\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064252 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064178 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064468 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:15:52.064439 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d8e91a5-19e6-40c2-b353-674e0838577c-data-volume\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.064667 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.064652 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.066442 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.066420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8d8e91a5-19e6-40c2-b353-674e0838577c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.072656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.072634 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwvl\" (UniqueName: \"kubernetes.io/projected/8d8e91a5-19e6-40c2-b353-674e0838577c-kube-api-access-tkwvl\") pod \"insights-runtime-extractor-6wx9g\" (UID: \"8d8e91a5-19e6-40c2-b353-674e0838577c\") " pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.165746 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.165670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8j2s\" (UniqueName: \"kubernetes.io/projected/92f991e3-505f-4a83-870d-ff28b1ed3fad-kube-api-access-n8j2s\") pod \"downloads-6bcc868b7-nc9dz\" (UID: 
\"92f991e3-505f-4a83-870d-ff28b1ed3fad\") " pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:15:52.176848 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.176828 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8j2s\" (UniqueName: \"kubernetes.io/projected/92f991e3-505f-4a83-870d-ff28b1ed3fad-kube-api-access-n8j2s\") pod \"downloads-6bcc868b7-nc9dz\" (UID: \"92f991e3-505f-4a83-870d-ff28b1ed3fad\") " pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:15:52.201659 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.201636 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6wx9g" Apr 16 22:15:52.286070 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.286044 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:15:52.319272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.319243 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6wx9g"] Apr 16 22:15:52.323934 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:15:52.323911 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8e91a5_19e6_40c2_b353_674e0838577c.slice/crio-3882f9068a8ba97bdc1894e19f785de8e357ef34a366225125306b4f4bc2f0a1 WatchSource:0}: Error finding container 3882f9068a8ba97bdc1894e19f785de8e357ef34a366225125306b4f4bc2f0a1: Status 404 returned error can't find the container with id 3882f9068a8ba97bdc1894e19f785de8e357ef34a366225125306b4f4bc2f0a1 Apr 16 22:15:52.403390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.403350 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-nc9dz"] Apr 16 22:15:52.416756 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:15:52.416697 2562 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f991e3_505f_4a83_870d_ff28b1ed3fad.slice/crio-6f577ca83484c2ccbb1d26e6af2a1602c578f2c20fcdaf1a639680e32b1b536a WatchSource:0}: Error finding container 6f577ca83484c2ccbb1d26e6af2a1602c578f2c20fcdaf1a639680e32b1b536a: Status 404 returned error can't find the container with id 6f577ca83484c2ccbb1d26e6af2a1602c578f2c20fcdaf1a639680e32b1b536a Apr 16 22:15:52.557962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.557916 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-nc9dz" event={"ID":"92f991e3-505f-4a83-870d-ff28b1ed3fad","Type":"ContainerStarted","Data":"6f577ca83484c2ccbb1d26e6af2a1602c578f2c20fcdaf1a639680e32b1b536a"} Apr 16 22:15:52.559201 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.559177 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6wx9g" event={"ID":"8d8e91a5-19e6-40c2-b353-674e0838577c","Type":"ContainerStarted","Data":"5a1d076a88b76311ad8cff5fe908ea5990cf930ffdf266fee4797b2e02804c3b"} Apr 16 22:15:52.559299 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:52.559209 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6wx9g" event={"ID":"8d8e91a5-19e6-40c2-b353-674e0838577c","Type":"ContainerStarted","Data":"3882f9068a8ba97bdc1894e19f785de8e357ef34a366225125306b4f4bc2f0a1"} Apr 16 22:15:53.563471 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:53.563433 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6wx9g" event={"ID":"8d8e91a5-19e6-40c2-b353-674e0838577c","Type":"ContainerStarted","Data":"4ab0b546217983259e0a7cec0e89065d35287abed888a47a98b78b96803854ba"} Apr 16 22:15:54.567143 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:54.567106 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-6wx9g" event={"ID":"8d8e91a5-19e6-40c2-b353-674e0838577c","Type":"ContainerStarted","Data":"aa598320e04100ca46e664690fad0d1b3c516af6ef0015170f480a1c4a21b5ec"} Apr 16 22:15:54.584967 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:15:54.584905 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6wx9g" podStartSLOduration=1.504752339 podStartE2EDuration="3.584885016s" podCreationTimestamp="2026-04-16 22:15:51 +0000 UTC" firstStartedPulling="2026-04-16 22:15:52.39643175 +0000 UTC m=+144.878950864" lastFinishedPulling="2026-04-16 22:15:54.476564413 +0000 UTC m=+146.959083541" observedRunningTime="2026-04-16 22:15:54.583335051 +0000 UTC m=+147.065854188" watchObservedRunningTime="2026-04-16 22:15:54.584885016 +0000 UTC m=+147.067404184" Apr 16 22:16:03.363660 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:16:03.363597 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" Apr 16 22:16:03.374737 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:16:03.374702 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zprfg" podUID="896dfd17-7377-42ff-b2c1-0ff2bbb1909a" Apr 16 22:16:03.458127 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:16:03.458087 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xg95t" podUID="262b53c6-89e8-4fcb-9d2d-6de1c03648ad" Apr 16 22:16:03.591191 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:03.591156 
2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xg95t" Apr 16 22:16:04.038155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.038120 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xtdtz"] Apr 16 22:16:04.041597 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.041569 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.046493 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.046431 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:04.047101 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.047082 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:04.047821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.047799 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:04.048599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.048578 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:16:04.048907 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.048890 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:04.049108 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.049076 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:16:04.049216 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.049203 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4vs7f\"" Apr 16 22:16:04.170216 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170145 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-root\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170216 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170207 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170243 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-tls\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170444 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-textfile\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170444 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170425 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-wtmp\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170548 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170477 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-sys\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170548 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170531 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-metrics-client-ca\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170565 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mzfw\" (UniqueName: \"kubernetes.io/projected/50c10cf9-25e9-4cb5-a882-5cb45721bda9-kube-api-access-8mzfw\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.170702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.170668 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271316 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271283 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-root\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271336 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271412 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-root\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-tls\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271566 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-textfile\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " 
pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271627 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-wtmp\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271655 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-sys\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271692 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-metrics-client-ca\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mzfw\" (UniqueName: \"kubernetes.io/projected/50c10cf9-25e9-4cb5-a882-5cb45721bda9-kube-api-access-8mzfw\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271754 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.271844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271779 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-wtmp\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.272073 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271849 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50c10cf9-25e9-4cb5-a882-5cb45721bda9-sys\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.272073 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.271923 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-textfile\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.272073 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.272024 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.272384 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.272358 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/50c10cf9-25e9-4cb5-a882-5cb45721bda9-metrics-client-ca\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.274187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.274166 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-tls\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.274384 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.274366 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/50c10cf9-25e9-4cb5-a882-5cb45721bda9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.308713 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.308687 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mzfw\" (UniqueName: \"kubernetes.io/projected/50c10cf9-25e9-4cb5-a882-5cb45721bda9-kube-api-access-8mzfw\") pod \"node-exporter-xtdtz\" (UID: \"50c10cf9-25e9-4cb5-a882-5cb45721bda9\") " pod="openshift-monitoring/node-exporter-xtdtz" Apr 16 22:16:04.352764 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:04.352736 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtdtz"
Apr 16 22:16:05.144522 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:16:05.144485 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4zqvj" podUID="ef0b8b85-4299-4164-b2f4-ae06377db331"
Apr 16 22:16:05.180403 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.180367 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:16:05.185500 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.185473 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.193850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.193823 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 22:16:05.193963 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.193911 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 22:16:05.196407 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.196386 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 22:16:05.196652 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.196638 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8thgp\""
Apr 16 22:16:05.196725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.196642 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 22:16:05.197536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.197518 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 22:16:05.198821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.198806 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 22:16:05.209043 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.209024 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:16:05.211421 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.211398 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 22:16:05.211557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.211404 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 22:16:05.215353 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.215314 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 22:16:05.281180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281151 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281353 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281206 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281353 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281285 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281353 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281322 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281490 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281408 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281490 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281490 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281470 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281514 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281560 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281593 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281753 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281654 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281753 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281679 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbldp\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.281753 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.281714 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.382191 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.382158 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.382385 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.382213 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.382385 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.382244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.382385 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.382267 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383205 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383176 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383246 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383282 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383442 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383351 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383442 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383391 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383444 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383489 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbldp\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383582 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.383754 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.383732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.385933 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.384078 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.385933 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.384496 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.387468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.387444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.388187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.388036 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.389947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.389927 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.390742 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.390168 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.391739 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.391715 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.391995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.391975 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.392128 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.392042 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.393554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.393530 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.393854 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.393835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.484392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.484322 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbldp\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp\") pod \"alertmanager-main-0\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:05.497158 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:05.497130 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:16:08.207417 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.207378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:16:08.210189 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.210157 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"image-registry-6b85c8896b-qn2l2\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:16:08.308411 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.308371 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:16:08.308591 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.308434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:16:08.311046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.311016 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/896dfd17-7377-42ff-b2c1-0ff2bbb1909a-metrics-tls\") pod \"dns-default-zprfg\" (UID: \"896dfd17-7377-42ff-b2c1-0ff2bbb1909a\") " pod="openshift-dns/dns-default-zprfg"
Apr 16 22:16:08.311161 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.311073 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/262b53c6-89e8-4fcb-9d2d-6de1c03648ad-cert\") pod \"ingress-canary-xg95t\" (UID: \"262b53c6-89e8-4fcb-9d2d-6de1c03648ad\") " pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:16:08.394316 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.394285 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\""
Apr 16 22:16:08.402700 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.402675 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xg95t"
Apr 16 22:16:08.769307 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.769265 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"]
Apr 16 22:16:08.773764 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.773732 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"
Apr 16 22:16:08.780352 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.780322 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 22:16:08.780352 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.780342 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7f4jj\""
Apr 16 22:16:08.806352 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.805064 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:16:08.808656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.808546 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"]
Apr 16 22:16:08.808808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.808740 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.814242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.813785 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 22:16:08.814242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.813825 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 22:16:08.814242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.813855 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 22:16:08.814242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.814071 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kf4z8\""
Apr 16 22:16:08.814578 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.814553 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 22:16:08.815180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.814748 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 22:16:08.821440 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.821414 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 22:16:08.834896 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.834842 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:16:08.914916 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.914882 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915108 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.914940 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915108 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915025 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/085d2591-737c-4156-a33c-8c6f9b307ada-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-b97qf\" (UID: \"085d2591-737c-4156-a33c-8c6f9b307ada\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"
Apr 16 22:16:08.915108 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915060 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915155 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915191 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915363 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8g4\" (UniqueName: \"kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:08.915363 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:08.915321 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016441 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016393 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8g4\" (UniqueName: \"kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016545 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016583 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/085d2591-737c-4156-a33c-8c6f9b307ada-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-b97qf\" (UID: \"085d2591-737c-4156-a33c-8c6f9b307ada\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016678 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.016696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.016705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.017904 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.017810 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.017904 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.017846 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.018492 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.018469 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.020264 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.020118 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.020553 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.020515 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.021519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.021482 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.023206 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.023156 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/085d2591-737c-4156-a33c-8c6f9b307ada-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-b97qf\" (UID: \"085d2591-737c-4156-a33c-8c6f9b307ada\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"
Apr 16 22:16:09.028419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.028391 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8g4\" (UniqueName: \"kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4\") pod \"console-ddb44ff65-4kt8q\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") " pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:09.085227 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.085194 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"
Apr 16 22:16:09.120901 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:09.120862 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:16:10.609885 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:10.609845 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50c10cf9_25e9_4cb5_a882_5cb45721bda9.slice/crio-4d6e689939b0c9852a08b14074e4a9ba834a843fbc93241263419d77af079b9b WatchSource:0}: Error finding container 4d6e689939b0c9852a08b14074e4a9ba834a843fbc93241263419d77af079b9b: Status 404 returned error can't find the container with id 4d6e689939b0c9852a08b14074e4a9ba834a843fbc93241263419d77af079b9b
Apr 16 22:16:10.792185 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:10.792006 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:16:10.806762 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:10.806725 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b811b8_6501_4936_8da2_f9091a8042f0.slice/crio-ad776579012e1ef088ccf0b8a7b630fdc69e76e7eac9475fdbfc2feae80d9ce5 WatchSource:0}: Error finding container ad776579012e1ef088ccf0b8a7b630fdc69e76e7eac9475fdbfc2feae80d9ce5: Status 404 returned error can't find the container with id ad776579012e1ef088ccf0b8a7b630fdc69e76e7eac9475fdbfc2feae80d9ce5
Apr 16 22:16:10.841684 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:10.841591 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:16:10.844831 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:10.844799 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881f7b46_8c34_4d07_9ba4_ced81175999e.slice/crio-b9db9e3e608de811d26f02ce335df30413a976793c6a78c39540598c40e7a93b WatchSource:0}: Error finding container b9db9e3e608de811d26f02ce335df30413a976793c6a78c39540598c40e7a93b: Status 404 returned error can't find the container with id b9db9e3e608de811d26f02ce335df30413a976793c6a78c39540598c40e7a93b
Apr 16 22:16:11.005678 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.005643 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xg95t"]
Apr 16 22:16:11.008140 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.008095 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf"]
Apr 16 22:16:11.008446 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:11.008416 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod262b53c6_89e8_4fcb_9d2d_6de1c03648ad.slice/crio-f2d362200e2e518e3ee452caef2507c19deaa0c9ae70362ef9c669e4700c6d5b WatchSource:0}: Error finding container f2d362200e2e518e3ee452caef2507c19deaa0c9ae70362ef9c669e4700c6d5b: Status 404 returned error can't find the container with id f2d362200e2e518e3ee452caef2507c19deaa0c9ae70362ef9c669e4700c6d5b
Apr 16 22:16:11.012303 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:11.012266 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod085d2591_737c_4156_a33c_8c6f9b307ada.slice/crio-2495ffe55ceab681d7b23e9c40de7b5f8c88b7c3e20faf58b433f73398fb2196 WatchSource:0}: Error finding container 2495ffe55ceab681d7b23e9c40de7b5f8c88b7c3e20faf58b433f73398fb2196: Status 404 returned error can't find the container with id 2495ffe55ceab681d7b23e9c40de7b5f8c88b7c3e20faf58b433f73398fb2196
Apr 16 22:16:11.619946 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.617322
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-nc9dz" event={"ID":"92f991e3-505f-4a83-870d-ff28b1ed3fad","Type":"ContainerStarted","Data":"9f1ffd90bd14af156646cb0b8c3b8dd33eca4e49134fda88866aebc11afba5c3"} Apr 16 22:16:11.619946 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.618262 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:16:11.622494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.620835 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf" event={"ID":"085d2591-737c-4156-a33c-8c6f9b307ada","Type":"ContainerStarted","Data":"2495ffe55ceab681d7b23e9c40de7b5f8c88b7c3e20faf58b433f73398fb2196"} Apr 16 22:16:11.622494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.622083 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xg95t" event={"ID":"262b53c6-89e8-4fcb-9d2d-6de1c03648ad","Type":"ContainerStarted","Data":"f2d362200e2e518e3ee452caef2507c19deaa0c9ae70362ef9c669e4700c6d5b"} Apr 16 22:16:11.625404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.624155 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"b9db9e3e608de811d26f02ce335df30413a976793c6a78c39540598c40e7a93b"} Apr 16 22:16:11.625734 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.625695 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddb44ff65-4kt8q" event={"ID":"54b811b8-6501-4936-8da2-f9091a8042f0","Type":"ContainerStarted","Data":"ad776579012e1ef088ccf0b8a7b630fdc69e76e7eac9475fdbfc2feae80d9ce5"} Apr 16 22:16:11.627936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.627846 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-xtdtz" event={"ID":"50c10cf9-25e9-4cb5-a882-5cb45721bda9","Type":"ContainerStarted","Data":"4d6e689939b0c9852a08b14074e4a9ba834a843fbc93241263419d77af079b9b"} Apr 16 22:16:11.636076 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.636022 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-nc9dz" podStartSLOduration=2.3333462 podStartE2EDuration="20.636003269s" podCreationTimestamp="2026-04-16 22:15:51 +0000 UTC" firstStartedPulling="2026-04-16 22:15:52.418465508 +0000 UTC m=+144.900984642" lastFinishedPulling="2026-04-16 22:16:10.721122583 +0000 UTC m=+163.203641711" observedRunningTime="2026-04-16 22:16:11.634745751 +0000 UTC m=+164.117264887" watchObservedRunningTime="2026-04-16 22:16:11.636003269 +0000 UTC m=+164.118522406" Apr 16 22:16:11.642148 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:11.642121 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-nc9dz" Apr 16 22:16:12.635395 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:12.635327 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5"} Apr 16 22:16:12.638082 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:12.638034 2562 generic.go:358] "Generic (PLEG): container finished" podID="50c10cf9-25e9-4cb5-a882-5cb45721bda9" containerID="c80fc1a6aaa5e94bc847388206970d1311167549f513a661adf2bdb553e03222" exitCode=0 Apr 16 22:16:12.638398 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:12.638352 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtdtz" event={"ID":"50c10cf9-25e9-4cb5-a882-5cb45721bda9","Type":"ContainerDied","Data":"c80fc1a6aaa5e94bc847388206970d1311167549f513a661adf2bdb553e03222"} Apr 
16 22:16:14.647387 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:14.647343 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5" exitCode=0 Apr 16 22:16:14.647940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:14.647426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5"} Apr 16 22:16:15.652856 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:15.652799 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtdtz" event={"ID":"50c10cf9-25e9-4cb5-a882-5cb45721bda9","Type":"ContainerStarted","Data":"6d2d42c691055d417f4c1ac6d63332641e9549fe417a2b09f9c3c3d6afa3850b"} Apr 16 22:16:16.659339 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.659244 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf" event={"ID":"085d2591-737c-4156-a33c-8c6f9b307ada","Type":"ContainerStarted","Data":"d5626b1ff97bd46d68cd8f898a17b99c12a2d9a6c2fdd6bfc69db969272e1b63"} Apr 16 22:16:16.659904 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.659475 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf" Apr 16 22:16:16.661118 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.661068 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xg95t" event={"ID":"262b53c6-89e8-4fcb-9d2d-6de1c03648ad","Type":"ContainerStarted","Data":"f7742e8171ce97c705eef3225c73978c38b9b55519a0f717e7dd98cb69d38166"} Apr 16 22:16:16.662943 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.662911 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-ddb44ff65-4kt8q" event={"ID":"54b811b8-6501-4936-8da2-f9091a8042f0","Type":"ContainerStarted","Data":"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"} Apr 16 22:16:16.665147 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.665120 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf" Apr 16 22:16:16.665300 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.665268 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtdtz" event={"ID":"50c10cf9-25e9-4cb5-a882-5cb45721bda9","Type":"ContainerStarted","Data":"30b8faa9d67d9298cb2a346e2beb76e75676bc7cd58c374316c8d456256063e0"} Apr 16 22:16:16.675424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.675364 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-b97qf" podStartSLOduration=4.151109893 podStartE2EDuration="8.675345305s" podCreationTimestamp="2026-04-16 22:16:08 +0000 UTC" firstStartedPulling="2026-04-16 22:16:11.01470035 +0000 UTC m=+163.497219464" lastFinishedPulling="2026-04-16 22:16:15.53893575 +0000 UTC m=+168.021454876" observedRunningTime="2026-04-16 22:16:16.674339603 +0000 UTC m=+169.156858740" watchObservedRunningTime="2026-04-16 22:16:16.675345305 +0000 UTC m=+169.157864449" Apr 16 22:16:16.689205 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.689150 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xg95t" podStartSLOduration=132.162066404 podStartE2EDuration="2m16.689137245s" podCreationTimestamp="2026-04-16 22:14:00 +0000 UTC" firstStartedPulling="2026-04-16 22:16:11.011092571 +0000 UTC m=+163.493611684" lastFinishedPulling="2026-04-16 22:16:15.538163402 +0000 UTC m=+168.020682525" observedRunningTime="2026-04-16 22:16:16.688628126 +0000 UTC m=+169.171147274" 
watchObservedRunningTime="2026-04-16 22:16:16.689137245 +0000 UTC m=+169.171656379" Apr 16 22:16:16.705440 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.705376 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-ddb44ff65-4kt8q" podStartSLOduration=3.9678890669999998 podStartE2EDuration="8.705355865s" podCreationTimestamp="2026-04-16 22:16:08 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.809227342 +0000 UTC m=+163.291746458" lastFinishedPulling="2026-04-16 22:16:15.546694135 +0000 UTC m=+168.029213256" observedRunningTime="2026-04-16 22:16:16.70454055 +0000 UTC m=+169.187059686" watchObservedRunningTime="2026-04-16 22:16:16.705355865 +0000 UTC m=+169.187874999" Apr 16 22:16:16.736661 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:16.736589 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtdtz" podStartSLOduration=12.795150779 podStartE2EDuration="13.736573486s" podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.613337001 +0000 UTC m=+163.095856129" lastFinishedPulling="2026-04-16 22:16:11.554759711 +0000 UTC m=+164.037278836" observedRunningTime="2026-04-16 22:16:16.734916681 +0000 UTC m=+169.217435831" watchObservedRunningTime="2026-04-16 22:16:16.736573486 +0000 UTC m=+169.219092620" Apr 16 22:16:17.125634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.125591 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zprfg" Apr 16 22:16:17.125790 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.125656 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:16:17.128757 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.128724 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\"" Apr 16 22:16:17.128916 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.128759 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s46c2\"" Apr 16 22:16:17.136788 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.136744 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:16:17.136967 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.136766 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zprfg" Apr 16 22:16:17.296768 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.296681 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"] Apr 16 22:16:17.299738 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:17.299705 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d30e65_e56d_4830_9544_0d047de3e6e6.slice/crio-6f36bbb45a7882b759daada08acd09880c018a019db22191187f414cf52286b3 WatchSource:0}: Error finding container 6f36bbb45a7882b759daada08acd09880c018a019db22191187f414cf52286b3: Status 404 returned error can't find the container with id 6f36bbb45a7882b759daada08acd09880c018a019db22191187f414cf52286b3 Apr 16 22:16:17.313782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.313733 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zprfg"] Apr 16 22:16:17.317301 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:16:17.317273 2562 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896dfd17_7377_42ff_b2c1_0ff2bbb1909a.slice/crio-61a978afbd4f3ea7a1e8d146feaaaa39bd5c54b65526ca384da880f3232ba864 WatchSource:0}: Error finding container 61a978afbd4f3ea7a1e8d146feaaaa39bd5c54b65526ca384da880f3232ba864: Status 404 returned error can't find the container with id 61a978afbd4f3ea7a1e8d146feaaaa39bd5c54b65526ca384da880f3232ba864 Apr 16 22:16:17.672457 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.672418 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"} Apr 16 22:16:17.672457 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.672462 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78"} Apr 16 22:16:17.673009 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.672479 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be"} Apr 16 22:16:17.673009 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.672496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90"} Apr 16 22:16:17.673009 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.672508 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207"} Apr 16 22:16:17.674162 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.674118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" event={"ID":"f5d30e65-e56d-4830-9544-0d047de3e6e6","Type":"ContainerStarted","Data":"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"} Apr 16 22:16:17.674162 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.674157 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" event={"ID":"f5d30e65-e56d-4830-9544-0d047de3e6e6","Type":"ContainerStarted","Data":"6f36bbb45a7882b759daada08acd09880c018a019db22191187f414cf52286b3"} Apr 16 22:16:17.674347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.674247 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:16:17.675393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.675364 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zprfg" event={"ID":"896dfd17-7377-42ff-b2c1-0ff2bbb1909a","Type":"ContainerStarted","Data":"61a978afbd4f3ea7a1e8d146feaaaa39bd5c54b65526ca384da880f3232ba864"} Apr 16 22:16:17.701320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:17.701219 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" podStartSLOduration=149.701199305 podStartE2EDuration="2m29.701199305s" podCreationTimestamp="2026-04-16 22:13:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:17.700073729 +0000 UTC m=+170.182592865" watchObservedRunningTime="2026-04-16 22:16:17.701199305 +0000 UTC 
m=+170.183718440" Apr 16 22:16:19.121981 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:19.121935 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-ddb44ff65-4kt8q" Apr 16 22:16:19.122424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:19.121997 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-ddb44ff65-4kt8q" Apr 16 22:16:19.123639 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:19.123569 2562 patch_prober.go:28] interesting pod/console-ddb44ff65-4kt8q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" start-of-body= Apr 16 22:16:19.123746 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:19.123661 2562 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-ddb44ff65-4kt8q" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console" probeResult="failure" output="Get \"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" Apr 16 22:16:20.126441 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.126402 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj" Apr 16 22:16:20.687447 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.687407 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zprfg" event={"ID":"896dfd17-7377-42ff-b2c1-0ff2bbb1909a","Type":"ContainerStarted","Data":"01abc4d75ba6be0c50b409a1670aab7870c17d62f858dab4fba5a95c3f028be9"} Apr 16 22:16:20.687664 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.687454 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zprfg" event={"ID":"896dfd17-7377-42ff-b2c1-0ff2bbb1909a","Type":"ContainerStarted","Data":"dabf2438bb15598b54cf0ad8125c63a0018696b36109054e2dcead66fa55ebec"} Apr 16 22:16:20.687664 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.687509 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zprfg" Apr 16 22:16:20.690949 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.690917 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerStarted","Data":"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"} Apr 16 22:16:20.704351 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.704295 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zprfg" podStartSLOduration=138.4188555 podStartE2EDuration="2m20.704280584s" podCreationTimestamp="2026-04-16 22:14:00 +0000 UTC" firstStartedPulling="2026-04-16 22:16:17.319663391 +0000 UTC m=+169.802182517" lastFinishedPulling="2026-04-16 22:16:19.605088467 +0000 UTC m=+172.087607601" observedRunningTime="2026-04-16 22:16:20.703168501 +0000 UTC m=+173.185687636" watchObservedRunningTime="2026-04-16 22:16:20.704280584 +0000 UTC m=+173.186799752" Apr 16 22:16:20.730792 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:20.730731 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.97029539 podStartE2EDuration="15.730711563s" podCreationTimestamp="2026-04-16 22:16:05 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.84690743 +0000 UTC m=+163.329426547" lastFinishedPulling="2026-04-16 22:16:19.607323595 +0000 UTC m=+172.089842720" observedRunningTime="2026-04-16 22:16:20.729302854 +0000 UTC m=+173.211821988" watchObservedRunningTime="2026-04-16 22:16:20.730711563 +0000 UTC m=+173.213230698" Apr 16 22:16:29.121767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:29.121729 2562 patch_prober.go:28] interesting pod/console-ddb44ff65-4kt8q container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" start-of-body= Apr 16 22:16:29.122223 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:29.121787 2562 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-ddb44ff65-4kt8q" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console" probeResult="failure" output="Get \"https://10.132.0.14:8443/health\": dial tcp 10.132.0.14:8443: connect: connection refused" Apr 16 22:16:30.696751 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:30.696720 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zprfg" Apr 16 22:16:37.140439 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:37.140406 2562 patch_prober.go:28] interesting pod/image-registry-6b85c8896b-qn2l2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 22:16:37.140833 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:37.140455 2562 
prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:16:38.682271 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:38.682233 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:16:39.126059 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:39.126026 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-ddb44ff65-4kt8q" Apr 16 22:16:39.129850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:39.129832 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-ddb44ff65-4kt8q" Apr 16 22:16:42.367486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:42.367452 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"] Apr 16 22:16:52.236985 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:16:52.236950 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xg95t_262b53c6-89e8-4fcb-9d2d-6de1c03648ad/serve-healthcheck-canary/0.log" Apr 16 22:17:01.348833 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:01.348776 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" podUID="14c99cff-d122-4d8b-9021-5b7ae6333efa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 22:17:07.389292 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.389238 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" 
containerName="registry" containerID="cri-o://f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5" gracePeriod=30 Apr 16 22:17:07.631990 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.631968 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" Apr 16 22:17:07.715824 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715764 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " Apr 16 22:17:07.715824 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715805 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr4f7\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " Apr 16 22:17:07.715824 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715824 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " Apr 16 22:17:07.716036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715861 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") " Apr 16 22:17:07.716036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715894 2562 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") "
Apr 16 22:17:07.716036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.715920 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") "
Apr 16 22:17:07.716036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.716011 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") "
Apr 16 22:17:07.716233 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.716050 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted\") pod \"f5d30e65-e56d-4830-9544-0d047de3e6e6\" (UID: \"f5d30e65-e56d-4830-9544-0d047de3e6e6\") "
Apr 16 22:17:07.716409 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.716346 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:17:07.716739 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.716709 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:17:07.718187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.718160 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:07.718320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.718298 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:07.718484 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.718439 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7" (OuterVolumeSpecName: "kube-api-access-sr4f7") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "kube-api-access-sr4f7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:07.718484 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.718478 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:07.718635 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.718513 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:07.724990 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.724961 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f5d30e65-e56d-4830-9544-0d047de3e6e6" (UID: "f5d30e65-e56d-4830-9544-0d047de3e6e6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:17:07.816906 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816880 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-certificates\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816910 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-image-registry-private-configuration\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816926 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f5d30e65-e56d-4830-9544-0d047de3e6e6-installation-pull-secrets\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816940 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d30e65-e56d-4830-9544-0d047de3e6e6-trusted-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816949 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f5d30e65-e56d-4830-9544-0d047de3e6e6-ca-trust-extracted\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816958 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-registry-tls\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816973 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sr4f7\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-kube-api-access-sr4f7\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.817005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.816982 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f5d30e65-e56d-4830-9544-0d047de3e6e6-bound-sa-token\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:07.820337 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.820315 2562 generic.go:358] "Generic (PLEG): container finished" podID="f5d30e65-e56d-4830-9544-0d047de3e6e6" containerID="f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5" exitCode=0
Apr 16 22:17:07.820428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.820372 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2"
Apr 16 22:17:07.820428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.820376 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" event={"ID":"f5d30e65-e56d-4830-9544-0d047de3e6e6","Type":"ContainerDied","Data":"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"}
Apr 16 22:17:07.820428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.820404 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b85c8896b-qn2l2" event={"ID":"f5d30e65-e56d-4830-9544-0d047de3e6e6","Type":"ContainerDied","Data":"6f36bbb45a7882b759daada08acd09880c018a019db22191187f414cf52286b3"}
Apr 16 22:17:07.820428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.820419 2562 scope.go:117] "RemoveContainer" containerID="f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"
Apr 16 22:17:07.829510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.829494 2562 scope.go:117] "RemoveContainer" containerID="f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"
Apr 16 22:17:07.829801 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:07.829778 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5\": container with ID starting with f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5 not found: ID does not exist" containerID="f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"
Apr 16 22:17:07.829877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.829813 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5"} err="failed to get container status \"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5\": rpc error: code = NotFound desc = could not find container \"f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5\": container with ID starting with f6fe53641e1d6c83c254e50c06b57c5214410fd1deb2b8ef79505b75bc2354f5 not found: ID does not exist"
Apr 16 22:17:07.843636 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.843593 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"]
Apr 16 22:17:07.846683 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:07.846661 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b85c8896b-qn2l2"]
Apr 16 22:17:08.129834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:08.129800 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" path="/var/lib/kubelet/pods/f5d30e65-e56d-4830-9544-0d047de3e6e6/volumes"
Apr 16 22:17:11.348533 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:11.348491 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" podUID="14c99cff-d122-4d8b-9021-5b7ae6333efa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:17:21.348923 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.348879 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" podUID="14c99cff-d122-4d8b-9021-5b7ae6333efa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 22:17:21.349324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.348943 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl"
Apr 16 22:17:21.349422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.349403 2562 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c759604321624b9cf86d932f83ca22f774784314620604b61e11899e331aa42e"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 22:17:21.349464 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.349443 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" podUID="14c99cff-d122-4d8b-9021-5b7ae6333efa" containerName="service-proxy" containerID="cri-o://c759604321624b9cf86d932f83ca22f774784314620604b61e11899e331aa42e" gracePeriod=30
Apr 16 22:17:21.867531 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.867497 2562 generic.go:358] "Generic (PLEG): container finished" podID="14c99cff-d122-4d8b-9021-5b7ae6333efa" containerID="c759604321624b9cf86d932f83ca22f774784314620604b61e11899e331aa42e" exitCode=2
Apr 16 22:17:21.867692 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.867565 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerDied","Data":"c759604321624b9cf86d932f83ca22f774784314620604b61e11899e331aa42e"}
Apr 16 22:17:21.867692 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:21.867627 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6944d7fd56-cm7bl" event={"ID":"14c99cff-d122-4d8b-9021-5b7ae6333efa","Type":"ContainerStarted","Data":"56e6406ff132185d51b076e97a955ed86751e3b4cc736b9f15cd07aec8c45617"}
Apr 16 22:17:24.362409 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362375 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:24.362822 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362799 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="alertmanager" containerID="cri-o://00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207" gracePeriod=120
Apr 16 22:17:24.362875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362860 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-metric" containerID="cri-o://c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab" gracePeriod=120
Apr 16 22:17:24.362926 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362892 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-web" containerID="cri-o://976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be" gracePeriod=120
Apr 16 22:17:24.362974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362930 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy" containerID="cri-o://5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78" gracePeriod=120
Apr 16 22:17:24.363021 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362908 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="prom-label-proxy" containerID="cri-o://9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0" gracePeriod=120
Apr 16 22:17:24.363021 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.362955 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="config-reloader" containerID="cri-o://de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90" gracePeriod=120
Apr 16 22:17:24.881354 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881321 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0" exitCode=0
Apr 16 22:17:24.881354 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881347 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab" exitCode=0
Apr 16 22:17:24.881354 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881353 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78" exitCode=0
Apr 16 22:17:24.881354 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881359 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90" exitCode=0
Apr 16 22:17:24.881354 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881364 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207" exitCode=0
Apr 16 22:17:24.881663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881391 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"}
Apr 16 22:17:24.881663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"}
Apr 16 22:17:24.881663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881434 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78"}
Apr 16 22:17:24.881663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881443 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90"}
Apr 16 22:17:24.881663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:24.881452 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207"}
Apr 16 22:17:25.591570 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.591549 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:25.634582 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634555 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.634756 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634586 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.634756 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634628 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.634756 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634655 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.634921 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634837 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.634921 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634886 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbldp\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634928 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634962 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634977 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634993 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.634991 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:17:25.635028 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635021 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635098 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635125 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635154 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls\") pod \"881f7b46-8c34-4d07-9ba4-ced81175999e\" (UID: \"881f7b46-8c34-4d07-9ba4-ced81175999e\") "
Apr 16 22:17:25.635479 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635378 2562 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-metrics-client-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.635479 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.635397 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-main-db\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.637437 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.637407 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:17:25.637816 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.637790 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume" (OuterVolumeSpecName: "config-volume") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.638278 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.638242 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp" (OuterVolumeSpecName: "kube-api-access-cbldp") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "kube-api-access-cbldp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:25.638539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.638513 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.639726 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.639700 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:17:25.640279 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.640239 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.640377 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.640279 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.640834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.640805 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.641859 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.641838 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out" (OuterVolumeSpecName: "config-out") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:17:25.642277 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.642217 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.648935 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.648909 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config" (OuterVolumeSpecName: "web-config") pod "881f7b46-8c34-4d07-9ba4-ced81175999e" (UID: "881f7b46-8c34-4d07-9ba4-ced81175999e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736325 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-web-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736351 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/881f7b46-8c34-4d07-9ba4-ced81175999e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736362 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/881f7b46-8c34-4d07-9ba4-ced81175999e-config-out\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736371 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736380 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736403 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-tls-assets\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736412 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-main-tls\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736421 2562 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-config-volume\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736429 2562 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-cluster-tls-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736438 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/881f7b46-8c34-4d07-9ba4-ced81175999e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.736676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.736448 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cbldp\" (UniqueName: \"kubernetes.io/projected/881f7b46-8c34-4d07-9ba4-ced81175999e-kube-api-access-cbldp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:17:25.887078 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.887041 2562 generic.go:358] "Generic (PLEG): container finished" podID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerID="976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be" exitCode=0
Apr 16 22:17:25.887209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.887099 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be"}
Apr 16 22:17:25.887209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.887134 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"881f7b46-8c34-4d07-9ba4-ced81175999e","Type":"ContainerDied","Data":"b9db9e3e608de811d26f02ce335df30413a976793c6a78c39540598c40e7a93b"}
Apr 16 22:17:25.887209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.887158 2562 scope.go:117] "RemoveContainer" containerID="9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"
Apr 16 22:17:25.887209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.887164 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 22:17:25.893963 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.893947 2562 scope.go:117] "RemoveContainer" containerID="c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"
Apr 16 22:17:25.900814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.900798 2562 scope.go:117] "RemoveContainer" containerID="5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78"
Apr 16 22:17:25.906494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.906474 2562 scope.go:117] "RemoveContainer" containerID="976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be"
Apr 16 22:17:25.910360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.910341 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:25.913283 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.913268 2562 scope.go:117] "RemoveContainer" containerID="de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90"
Apr 16 22:17:25.915814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.915794 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 22:17:25.919367 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.919352 2562 scope.go:117] "RemoveContainer" containerID="00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207"
Apr 16 22:17:25.925080 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.925064 2562 scope.go:117] "RemoveContainer" containerID="d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5"
Apr 16 22:17:25.931141 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931116 2562 scope.go:117] "RemoveContainer" containerID="9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"
Apr 16 22:17:25.931435 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.931357 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0\": container with ID starting with 9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0 not found: ID does not exist" containerID="9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"
Apr 16 22:17:25.931501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931445 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0"} err="failed to get container status \"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0\": rpc error: code = NotFound desc = could not find container \"9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0\": container with ID starting with 9df271a57580b92b06d8bddaa7964cc26e2a20a1ce4558d8491a8c993890f5a0 not found: ID does not exist"
Apr 16 22:17:25.931501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931473 2562 scope.go:117] "RemoveContainer" containerID="c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"
Apr 16 22:17:25.931734 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.931717 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab\": container with ID starting with c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab not found: ID does not exist" containerID="c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"
Apr 16 22:17:25.931782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931740 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab"} err="failed to get container status \"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab\": rpc error: code = NotFound desc 
= could not find container \"c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab\": container with ID starting with c6f021ba83fe4a96897745626ba27d25af3a9e9897a8ce8e6da0e79bd236dfab not found: ID does not exist" Apr 16 22:17:25.931782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931755 2562 scope.go:117] "RemoveContainer" containerID="5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78" Apr 16 22:17:25.931984 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.931966 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78\": container with ID starting with 5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78 not found: ID does not exist" containerID="5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78" Apr 16 22:17:25.932046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.931991 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78"} err="failed to get container status \"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78\": rpc error: code = NotFound desc = could not find container \"5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78\": container with ID starting with 5d5fbbc9f9e6ce11bff07c9d94087014efb3d2ca096d06ab6a330e8dc671ff78 not found: ID does not exist" Apr 16 22:17:25.932046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932013 2562 scope.go:117] "RemoveContainer" containerID="976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be" Apr 16 22:17:25.932257 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.932231 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be\": 
container with ID starting with 976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be not found: ID does not exist" containerID="976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be" Apr 16 22:17:25.932298 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932255 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be"} err="failed to get container status \"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be\": rpc error: code = NotFound desc = could not find container \"976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be\": container with ID starting with 976095bd60021f5d5df7b3807257c472e0b6aa34368b7468cfbc462bd4cdb3be not found: ID does not exist" Apr 16 22:17:25.932298 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932270 2562 scope.go:117] "RemoveContainer" containerID="de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90" Apr 16 22:17:25.932513 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.932497 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90\": container with ID starting with de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90 not found: ID does not exist" containerID="de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90" Apr 16 22:17:25.932570 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932522 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90"} err="failed to get container status \"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90\": rpc error: code = NotFound desc = could not find container \"de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90\": container with 
ID starting with de3b057269118ea75a07cc88121b0e76cf177bf0425ce1b5b29ea1eb81b9fb90 not found: ID does not exist" Apr 16 22:17:25.932570 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932543 2562 scope.go:117] "RemoveContainer" containerID="00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207" Apr 16 22:17:25.932785 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.932768 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207\": container with ID starting with 00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207 not found: ID does not exist" containerID="00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207" Apr 16 22:17:25.932831 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932789 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207"} err="failed to get container status \"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207\": rpc error: code = NotFound desc = could not find container \"00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207\": container with ID starting with 00529082f9cffcb8eb7dfe2e1606d2013fde1981377e3b31b47363ac7c75c207 not found: ID does not exist" Apr 16 22:17:25.932831 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932802 2562 scope.go:117] "RemoveContainer" containerID="d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5" Apr 16 22:17:25.932995 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:17:25.932979 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5\": container with ID starting with d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5 not found: ID does 
not exist" containerID="d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5" Apr 16 22:17:25.933038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.932998 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5"} err="failed to get container status \"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5\": rpc error: code = NotFound desc = could not find container \"d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5\": container with ID starting with d58f2a280d7dc9a7c588de3e4753684260a16884d860820226c56e9cc37fe5d5 not found: ID does not exist" Apr 16 22:17:25.939240 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939220 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:25.939458 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939446 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939459 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939468 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="alertmanager" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939474 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="alertmanager" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939479 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" 
containerName="config-reloader" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939485 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="config-reloader" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939491 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-web" Apr 16 22:17:25.939498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939496 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-web" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939502 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-metric" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939508 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-metric" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939517 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" containerName="registry" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939523 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" containerName="registry" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939533 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="prom-label-proxy" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939540 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="prom-label-proxy" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939560 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="init-config-reloader" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939566 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="init-config-reloader" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939620 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="config-reloader" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939628 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5d30e65-e56d-4830-9544-0d047de3e6e6" containerName="registry" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939637 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-web" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939644 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="alertmanager" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939650 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939655 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="kube-rbac-proxy-metric" Apr 16 22:17:25.939773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.939662 2562 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" containerName="prom-label-proxy" Apr 16 22:17:25.943758 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.943743 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:25.946055 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946037 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 22:17:25.946134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946041 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 22:17:25.946192 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946044 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 22:17:25.946283 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946265 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 22:17:25.946335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946265 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 22:17:25.946557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946542 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 22:17:25.946670 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946655 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8thgp\"" Apr 16 22:17:25.946774 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.946756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 22:17:25.947046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.947031 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 22:17:25.951413 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.951230 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 22:17:25.958410 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:25.958364 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:26.039517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039491 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039528 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039557 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-web-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
22:17:26.039665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039597 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6s56\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-kube-api-access-p6s56\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039665 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039664 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039685 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039719 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039734 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039750 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039819 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.039875 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.039848 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-out\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.129962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.129933 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881f7b46-8c34-4d07-9ba4-ced81175999e" path="/var/lib/kubelet/pods/881f7b46-8c34-4d07-9ba4-ced81175999e/volumes" Apr 16 22:17:26.141017 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.140997 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141036 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141057 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-out\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
16 22:17:26.141125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141115 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141309 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141144 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-web-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141309 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6s56\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-kube-api-access-p6s56\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141307 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141350 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141394 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141421 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141463 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141737 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141477 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.141852 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.141833 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.142486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.142463 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.143923 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.143903 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-out\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144133 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144107 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144219 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144198 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144274 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144235 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-config-volume\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144306 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144505 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144485 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144561 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144547 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-web-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.144940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.144920 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.145018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.145005 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.146041 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.146025 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.149205 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.149186 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6s56\" (UniqueName: \"kubernetes.io/projected/f60918ff-7aa3-4049-8c22-166f6ccb9eaf-kube-api-access-p6s56\") pod \"alertmanager-main-0\" (UID: \"f60918ff-7aa3-4049-8c22-166f6ccb9eaf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.272213 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.272181 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 22:17:26.394212 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.394157 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 22:17:26.397757 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:17:26.397721 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60918ff_7aa3_4049_8c22_166f6ccb9eaf.slice/crio-e7fd60dd7ec0ab77f7b02bfa35f029591c2c2506d58dae5ac2d19e6d5f1cc427 WatchSource:0}: Error finding container e7fd60dd7ec0ab77f7b02bfa35f029591c2c2506d58dae5ac2d19e6d5f1cc427: Status 404 returned error can't find the container with id e7fd60dd7ec0ab77f7b02bfa35f029591c2c2506d58dae5ac2d19e6d5f1cc427 Apr 16 22:17:26.891909 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.891867 2562 generic.go:358] "Generic (PLEG): container finished" podID="f60918ff-7aa3-4049-8c22-166f6ccb9eaf" containerID="9fce5e80389a6f79892392cb609adfbb8014c5e8fb5ae1220473055817fc402e" exitCode=0 Apr 16 22:17:26.892360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.891950 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerDied","Data":"9fce5e80389a6f79892392cb609adfbb8014c5e8fb5ae1220473055817fc402e"} Apr 16 22:17:26.892360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:26.891989 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"e7fd60dd7ec0ab77f7b02bfa35f029591c2c2506d58dae5ac2d19e6d5f1cc427"} Apr 16 22:17:27.898391 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898352 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"5027e110bf498b4ec4d3254a06ba41e122096018265302a9e7c19962475cac58"} Apr 16 22:17:27.898391 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898395 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"fe747444c20459732011e98e06de2ba3ffb40676d613e098e4947d33a4cf793d"} Apr 16 22:17:27.898819 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898408 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"66936932ac8fa13d9bb7941b746be8de4ac896276cfa23ffb41e38a458f88e79"} Apr 16 22:17:27.898819 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898420 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"5152eb15be65042c6bab3cbf984f5517a00a1d454265fe6ac1201d9549a24a82"} Apr 16 22:17:27.898819 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898434 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"c9064b9edc96d2f9f19335519359d4ac8d2843b4630cae380d4fe91167106eb0"} Apr 16 22:17:27.898819 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.898446 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f60918ff-7aa3-4049-8c22-166f6ccb9eaf","Type":"ContainerStarted","Data":"4a0a5c64298c4e6a083b7003bfddb23804eb53e195febdee581e941730a89b6d"} Apr 16 22:17:27.927167 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:27.927122 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.927105961 podStartE2EDuration="2.927105961s" podCreationTimestamp="2026-04-16 22:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:17:27.924720717 +0000 UTC m=+240.407239852" watchObservedRunningTime="2026-04-16 22:17:27.927105961 +0000 UTC m=+240.409625096" Apr 16 22:17:28.407112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.407079 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-d9b5c87-tgz2n"] Apr 16 22:17:28.411525 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.411506 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.415536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.415510 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 22:17:28.415536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.415527 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 22:17:28.415694 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.415674 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 22:17:28.415694 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.415680 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 22:17:28.415776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.415703 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 22:17:28.415951 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:17:28.415938 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8lk8r\"" Apr 16 22:17:28.425832 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.425806 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 22:17:28.426873 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.426851 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d9b5c87-tgz2n"] Apr 16 22:17:28.458233 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458194 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrd9\" (UniqueName: \"kubernetes.io/projected/10d51dde-059f-426f-91fb-fce0490d41c3-kube-api-access-6lrd9\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458248 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458300 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-metrics-client-ca\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458393 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:17:28.458347 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-serving-certs-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458365 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-federate-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458401 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458418 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.458554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.458478 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.559853 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.559810 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrd9\" (UniqueName: \"kubernetes.io/projected/10d51dde-059f-426f-91fb-fce0490d41c3-kube-api-access-6lrd9\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.559853 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.559859 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560037 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.559971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-metrics-client-ca\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560037 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560018 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-serving-certs-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: 
\"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560136 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560042 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-federate-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560136 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560136 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560125 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560336 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560155 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560787 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-serving-certs-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.560948 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-metrics-client-ca\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.561015 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.560996 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.562904 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.562885 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-telemeter-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.563023 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.563007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d9b5c87-tgz2n\" 
(UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.563092 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.563070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-secret-telemeter-client\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.563143 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.563089 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/10d51dde-059f-426f-91fb-fce0490d41c3-federate-client-tls\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.568379 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.568351 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrd9\" (UniqueName: \"kubernetes.io/projected/10d51dde-059f-426f-91fb-fce0490d41c3-kube-api-access-6lrd9\") pod \"telemeter-client-d9b5c87-tgz2n\" (UID: \"10d51dde-059f-426f-91fb-fce0490d41c3\") " pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.721939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.721857 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" Apr 16 22:17:28.836704 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.836317 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d9b5c87-tgz2n"] Apr 16 22:17:28.841366 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:17:28.841335 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d51dde_059f_426f_91fb_fce0490d41c3.slice/crio-9e1f8db8a49a594d45a797b0c3eefb66beee84381fb9e2d5b1a9f763a62a392c WatchSource:0}: Error finding container 9e1f8db8a49a594d45a797b0c3eefb66beee84381fb9e2d5b1a9f763a62a392c: Status 404 returned error can't find the container with id 9e1f8db8a49a594d45a797b0c3eefb66beee84381fb9e2d5b1a9f763a62a392c Apr 16 22:17:28.902323 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:28.902288 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" event={"ID":"10d51dde-059f-426f-91fb-fce0490d41c3","Type":"ContainerStarted","Data":"9e1f8db8a49a594d45a797b0c3eefb66beee84381fb9e2d5b1a9f763a62a392c"} Apr 16 22:17:30.910234 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:30.910191 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" event={"ID":"10d51dde-059f-426f-91fb-fce0490d41c3","Type":"ContainerStarted","Data":"40f8735ee17bc87bf4dde84b28ebabb14b3e2e7a3a9e331ed7cacb7760e87d9b"} Apr 16 22:17:30.910599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:30.910239 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" event={"ID":"10d51dde-059f-426f-91fb-fce0490d41c3","Type":"ContainerStarted","Data":"4f926d5cabc5d95ff30ec66e5638fcaad15cbf8191a4e899cd5e69c5f0708967"} Apr 16 22:17:30.910599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:30.910255 2562 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" event={"ID":"10d51dde-059f-426f-91fb-fce0490d41c3","Type":"ContainerStarted","Data":"d1a072276a5bd8a16abd931c67492a3550a9cd572225f340946b5e0ed2d4a859"} Apr 16 22:17:30.931299 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:30.931244 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-d9b5c87-tgz2n" podStartSLOduration=1.265779371 podStartE2EDuration="2.931228022s" podCreationTimestamp="2026-04-16 22:17:28 +0000 UTC" firstStartedPulling="2026-04-16 22:17:28.843086477 +0000 UTC m=+241.325605590" lastFinishedPulling="2026-04-16 22:17:30.508535127 +0000 UTC m=+242.991054241" observedRunningTime="2026-04-16 22:17:30.93003266 +0000 UTC m=+243.412551815" watchObservedRunningTime="2026-04-16 22:17:30.931228022 +0000 UTC m=+243.413747156" Apr 16 22:17:31.532134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.532104 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85d76f6948-sxkp2"] Apr 16 22:17:31.535347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.535327 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.546475 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.546448 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d76f6948-sxkp2"] Apr 16 22:17:31.580539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580513 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.580653 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580546 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.580653 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580564 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsqg\" (UniqueName: \"kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.580653 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580580 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 
22:17:31.580778 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.580778 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580712 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.580778 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.580745 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681733 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681711 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681745 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert\") pod 
\"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681762 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtsqg\" (UniqueName: \"kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681777 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681804 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.681855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681849 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.682106 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.681879 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.682553 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.682529 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.682663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.682565 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.682724 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.682705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.682779 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.682703 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:17:31.684302 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.684271 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:31.684381 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.684322 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:31.690836 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.690816 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtsqg\" (UniqueName: \"kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg\") pod \"console-85d76f6948-sxkp2\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") " pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:31.845187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.845164 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:31.962914 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:31.962890 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d76f6948-sxkp2"]
Apr 16 22:17:31.965430 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:17:31.965404 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3d9967_3904_4127_b4cb_6361dd3a1e9c.slice/crio-255824dbe0d6cf9fc3657d24aceb9ee062824eafa36b1d4ddf43f749721e457b WatchSource:0}: Error finding container 255824dbe0d6cf9fc3657d24aceb9ee062824eafa36b1d4ddf43f749721e457b: Status 404 returned error can't find the container with id 255824dbe0d6cf9fc3657d24aceb9ee062824eafa36b1d4ddf43f749721e457b
Apr 16 22:17:32.920414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:32.920378 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d76f6948-sxkp2" event={"ID":"9b3d9967-3904-4127-b4cb-6361dd3a1e9c","Type":"ContainerStarted","Data":"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94"}
Apr 16 22:17:32.920414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:32.920414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d76f6948-sxkp2" event={"ID":"9b3d9967-3904-4127-b4cb-6361dd3a1e9c","Type":"ContainerStarted","Data":"255824dbe0d6cf9fc3657d24aceb9ee062824eafa36b1d4ddf43f749721e457b"}
Apr 16 22:17:32.937971 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:32.937925 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85d76f6948-sxkp2" podStartSLOduration=1.9379100280000001 podStartE2EDuration="1.937910028s" podCreationTimestamp="2026-04-16 22:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:17:32.937377252 +0000 UTC m=+245.419896386" watchObservedRunningTime="2026-04-16 22:17:32.937910028 +0000 UTC m=+245.420429162"
Apr 16 22:17:40.050557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.050511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:17:40.052896 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.052872 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ef0b8b85-4299-4164-b2f4-ae06377db331-metrics-certs\") pod \"network-metrics-daemon-4zqvj\" (UID: \"ef0b8b85-4299-4164-b2f4-ae06377db331\") " pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:17:40.229453 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.229422 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\""
Apr 16 22:17:40.238082 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.238062 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zqvj"
Apr 16 22:17:40.352747 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.352720 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4zqvj"]
Apr 16 22:17:40.355341 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:17:40.355308 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0b8b85_4299_4164_b2f4_ae06377db331.slice/crio-cddae4dbbf34ed6721674a19dee1e537c752a873991e29170dc3ce004de92a40 WatchSource:0}: Error finding container cddae4dbbf34ed6721674a19dee1e537c752a873991e29170dc3ce004de92a40: Status 404 returned error can't find the container with id cddae4dbbf34ed6721674a19dee1e537c752a873991e29170dc3ce004de92a40
Apr 16 22:17:40.944680 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:40.944639 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zqvj" event={"ID":"ef0b8b85-4299-4164-b2f4-ae06377db331","Type":"ContainerStarted","Data":"cddae4dbbf34ed6721674a19dee1e537c752a873991e29170dc3ce004de92a40"}
Apr 16 22:17:41.845832 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.845798 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:41.845832 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.845830 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:41.850427 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.850405 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:41.949503 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.949472 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zqvj" event={"ID":"ef0b8b85-4299-4164-b2f4-ae06377db331","Type":"ContainerStarted","Data":"7569a759466a2e4b82b7aa8965f750da760a63b47bd4784c7085d2a602ad6bb9"}
Apr 16 22:17:41.949503 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.949504 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zqvj" event={"ID":"ef0b8b85-4299-4164-b2f4-ae06377db331","Type":"ContainerStarted","Data":"c5bceeaf48d0e11ac256d9a8ba7d4aef71d36ad41132900fad6f7e16350aec28"}
Apr 16 22:17:41.953620 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.953577 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:17:41.967435 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:41.967156 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4zqvj" podStartSLOduration=253.095581387 podStartE2EDuration="4m13.96714275s" podCreationTimestamp="2026-04-16 22:13:28 +0000 UTC" firstStartedPulling="2026-04-16 22:17:40.357323389 +0000 UTC m=+252.839842502" lastFinishedPulling="2026-04-16 22:17:41.228884748 +0000 UTC m=+253.711403865" observedRunningTime="2026-04-16 22:17:41.965972655 +0000 UTC m=+254.448491789" watchObservedRunningTime="2026-04-16 22:17:41.96714275 +0000 UTC m=+254.449661885"
Apr 16 22:17:42.011818 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:17:42.011792 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:18:07.030803 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.030745 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-ddb44ff65-4kt8q" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console" containerID="cri-o://e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165" gracePeriod=15
Apr 16 22:18:07.268442 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.268421 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ddb44ff65-4kt8q_54b811b8-6501-4936-8da2-f9091a8042f0/console/0.log"
Apr 16 22:18:07.268560 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.268482 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:18:07.358505 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358465 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358505 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358505 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc8g4\" (UniqueName: \"kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358757 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358538 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358757 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358732 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358833 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358811 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358871 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358935 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358899 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca\") pod \"54b811b8-6501-4936-8da2-f9091a8042f0\" (UID: \"54b811b8-6501-4936-8da2-f9091a8042f0\") "
Apr 16 22:18:07.358993 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.358978 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:07.359187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.359153 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-trusted-ca-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.359390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.359342 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:07.359390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.359371 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config" (OuterVolumeSpecName: "console-config") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:07.359558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.359444 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca" (OuterVolumeSpecName: "service-ca") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:18:07.360957 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.360927 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:07.360957 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.360943 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:18:07.361092 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.361027 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4" (OuterVolumeSpecName: "kube-api-access-cc8g4") pod "54b811b8-6501-4936-8da2-f9091a8042f0" (UID: "54b811b8-6501-4936-8da2-f9091a8042f0"). InnerVolumeSpecName "kube-api-access-cc8g4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:18:07.460008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.459975 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-oauth-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.460008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.460002 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cc8g4\" (UniqueName: \"kubernetes.io/projected/54b811b8-6501-4936-8da2-f9091a8042f0-kube-api-access-cc8g4\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.460008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.460012 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/54b811b8-6501-4936-8da2-f9091a8042f0-console-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.460226 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.460021 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-oauth-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.460226 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.460031 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-console-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:07.460226 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:07.460040 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/54b811b8-6501-4936-8da2-f9091a8042f0-service-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:18:08.021408 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021378 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-ddb44ff65-4kt8q_54b811b8-6501-4936-8da2-f9091a8042f0/console/0.log"
Apr 16 22:18:08.021632 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021417 2562 generic.go:358] "Generic (PLEG): container finished" podID="54b811b8-6501-4936-8da2-f9091a8042f0" containerID="e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165" exitCode=2
Apr 16 22:18:08.021632 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021473 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddb44ff65-4kt8q" event={"ID":"54b811b8-6501-4936-8da2-f9091a8042f0","Type":"ContainerDied","Data":"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"}
Apr 16 22:18:08.021632 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021491 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-ddb44ff65-4kt8q"
Apr 16 22:18:08.021632 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021506 2562 scope.go:117] "RemoveContainer" containerID="e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"
Apr 16 22:18:08.021632 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.021496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-ddb44ff65-4kt8q" event={"ID":"54b811b8-6501-4936-8da2-f9091a8042f0","Type":"ContainerDied","Data":"ad776579012e1ef088ccf0b8a7b630fdc69e76e7eac9475fdbfc2feae80d9ce5"}
Apr 16 22:18:08.029874 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.029858 2562 scope.go:117] "RemoveContainer" containerID="e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"
Apr 16 22:18:08.030160 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:18:08.030138 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165\": container with ID starting with e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165 not found: ID does not exist" containerID="e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"
Apr 16 22:18:08.030225 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.030170 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165"} err="failed to get container status \"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165\": rpc error: code = NotFound desc = could not find container \"e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165\": container with ID starting with e68a7992dfc8c8dd48320513ca08bad3e5f88c56848ce22092fc2b95aada4165 not found: ID does not exist"
Apr 16 22:18:08.041569 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.041545 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:18:08.047492 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.047471 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-ddb44ff65-4kt8q"]
Apr 16 22:18:08.129487 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:08.129455 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" path="/var/lib/kubelet/pods/54b811b8-6501-4936-8da2-f9091a8042f0/volumes"
Apr 16 22:18:28.004623 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:28.004586 2562 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 22:18:43.364819 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.364780 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"]
Apr 16 22:18:43.367282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.365062 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console"
Apr 16 22:18:43.367282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.365076 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console"
Apr 16 22:18:43.367282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.365129 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="54b811b8-6501-4936-8da2-f9091a8042f0" containerName="console"
Apr 16 22:18:43.368122 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.368103 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.378806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.378680 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"]
Apr 16 22:18:43.401657 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401625 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2sj\" (UniqueName: \"kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401803 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401803 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401689 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401803 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401720 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401803 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401787 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401933 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401824 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.401933 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.401849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503191 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503143 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503191 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503194 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503351 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503305 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503386 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503352 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2sj\" (UniqueName: \"kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503402 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503456 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503426 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503505 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503493 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503973 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503938 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.503973 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.503954 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.504166 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.504145 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.504288 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.504268 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.505853 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.505829 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.505936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.505907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.510676 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.510648 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2sj\" (UniqueName: \"kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj\") pod \"console-bc597bf69-zxj7c\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.677253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.677156 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:43.789480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.789447 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"]
Apr 16 22:18:43.792649 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:18:43.792588 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod055f8963_a89a_47a1_bdd6_1bea09c15863.slice/crio-774f49928f0279583c46b4aec8c1dc03c0a0c1f113156f5983b3391c795395bf WatchSource:0}: Error finding container 774f49928f0279583c46b4aec8c1dc03c0a0c1f113156f5983b3391c795395bf: Status 404 returned error can't find the container with id 774f49928f0279583c46b4aec8c1dc03c0a0c1f113156f5983b3391c795395bf
Apr 16 22:18:43.794975 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:43.794950 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:18:44.123322 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:44.123285 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc597bf69-zxj7c" event={"ID":"055f8963-a89a-47a1-bdd6-1bea09c15863","Type":"ContainerStarted","Data":"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523"}
Apr 16 22:18:44.123322 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:44.123324 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc597bf69-zxj7c" event={"ID":"055f8963-a89a-47a1-bdd6-1bea09c15863","Type":"ContainerStarted","Data":"774f49928f0279583c46b4aec8c1dc03c0a0c1f113156f5983b3391c795395bf"}
Apr 16 22:18:44.140014 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:44.139972 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bc597bf69-zxj7c" podStartSLOduration=1.139958918 podStartE2EDuration="1.139958918s" podCreationTimestamp="2026-04-16 22:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:44.139175272 +0000 UTC m=+316.621694415" watchObservedRunningTime="2026-04-16 22:18:44.139958918 +0000 UTC m=+316.622478053"
Apr 16 22:18:53.677555 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:53.677515 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:53.677992 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:53.677839 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:53.682281 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:53.682253 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:54.154506 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:54.154476 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bc597bf69-zxj7c"
Apr 16 22:18:54.200638 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:18:54.200590 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85d76f6948-sxkp2"]
Apr 16 22:19:19.220721 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.220580 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85d76f6948-sxkp2" podUID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" containerName="console" containerID="cri-o://a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94" gracePeriod=15
Apr 16 22:19:19.457775 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.457752 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85d76f6948-sxkp2_9b3d9967-3904-4127-b4cb-6361dd3a1e9c/console/0.log"
Apr 16 22:19:19.457879 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.457813 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d76f6948-sxkp2"
Apr 16 22:19:19.561586 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561561 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561623 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561651 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561671 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtsqg\" (UniqueName: \"kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561688 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561707 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.561762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.561747 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config\") pod \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\" (UID: \"9b3d9967-3904-4127-b4cb-6361dd3a1e9c\") "
Apr 16 22:19:19.562062 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.562039 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config" (OuterVolumeSpecName: "console-config") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:19.562119 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.562066 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca" (OuterVolumeSpecName: "service-ca") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:19.562119 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.562075 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:19.562238 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.562216 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:19:19.563845 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.563807 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg" (OuterVolumeSpecName: "kube-api-access-jtsqg") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "kube-api-access-jtsqg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:19:19.563960 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.563906 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:19:19.563960 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.563950 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9b3d9967-3904-4127-b4cb-6361dd3a1e9c" (UID: "9b3d9967-3904-4127-b4cb-6361dd3a1e9c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:19:19.663180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663133 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663173 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-oauth-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663183 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-service-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663194 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-oauth-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663204 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-console-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663470 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663213 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtsqg\" (UniqueName: \"kubernetes.io/projected/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-kube-api-access-jtsqg\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:19.663470 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:19.663222 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b3d9967-3904-4127-b4cb-6361dd3a1e9c-trusted-ca-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:20.215083 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215058 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85d76f6948-sxkp2_9b3d9967-3904-4127-b4cb-6361dd3a1e9c/console/0.log" Apr 16 22:19:20.215231 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215099 2562 generic.go:358] "Generic (PLEG): container finished" podID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" containerID="a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94" exitCode=2 Apr 16 22:19:20.215231 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215144 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d76f6948-sxkp2" event={"ID":"9b3d9967-3904-4127-b4cb-6361dd3a1e9c","Type":"ContainerDied","Data":"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94"} Apr 16 22:19:20.215231 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215171 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d76f6948-sxkp2" event={"ID":"9b3d9967-3904-4127-b4cb-6361dd3a1e9c","Type":"ContainerDied","Data":"255824dbe0d6cf9fc3657d24aceb9ee062824eafa36b1d4ddf43f749721e457b"} Apr 16 22:19:20.215231 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215174 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d76f6948-sxkp2" Apr 16 22:19:20.215231 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.215187 2562 scope.go:117] "RemoveContainer" containerID="a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94" Apr 16 22:19:20.223599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.223459 2562 scope.go:117] "RemoveContainer" containerID="a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94" Apr 16 22:19:20.223860 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:19:20.223751 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94\": container with ID starting with a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94 not found: ID does not exist" containerID="a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94" Apr 16 22:19:20.223860 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.223775 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94"} err="failed to get container status \"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94\": rpc error: code = NotFound desc = could not find container \"a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94\": container with ID starting with a46b3ef4e5432f7ff41f0ac9795accef7f212eb191003f3d256c8ab8e6705a94 not found: ID does not exist" Apr 16 22:19:20.232312 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.232283 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85d76f6948-sxkp2"] Apr 16 22:19:20.235831 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:20.235810 2562 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-85d76f6948-sxkp2"] Apr 16 22:19:22.130393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:22.130350 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" path="/var/lib/kubelet/pods/9b3d9967-3904-4127-b4cb-6361dd3a1e9c/volumes" Apr 16 22:19:38.052502 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.052469 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp"] Apr 16 22:19:38.052969 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.052835 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" containerName="console" Apr 16 22:19:38.052969 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.052849 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" containerName="console" Apr 16 22:19:38.052969 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.052914 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b3d9967-3904-4127-b4cb-6361dd3a1e9c" containerName="console" Apr 16 22:19:38.057732 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.057715 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.060546 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.060527 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\"" Apr 16 22:19:38.060850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.060833 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:19:38.061265 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.061250 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:19:38.070714 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.067943 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp"] Apr 16 22:19:38.205398 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.205370 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.205501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.205423 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.205501 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.205445 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.306301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.306244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.306301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.306288 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.306420 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.306306 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.306700 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:19:38.306678 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.306741 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.306699 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.314469 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.314450 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.367912 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.367884 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:38.480955 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:38.480919 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp"] Apr 16 22:19:38.483741 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:19:38.483701 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffd93146_ff80_48f7_9c41_e810e7b539c5.slice/crio-0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3 WatchSource:0}: Error finding container 0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3: Status 404 returned error can't find the container with id 0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3 Apr 16 22:19:39.269799 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:39.269766 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" event={"ID":"ffd93146-ff80-48f7-9c41-e810e7b539c5","Type":"ContainerStarted","Data":"0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3"} Apr 16 22:19:44.288838 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:44.288784 2562 generic.go:358] "Generic (PLEG): container finished" podID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerID="c1159c18611ac0c510fab251b4eff76a72df275a7bfb43ffd6816e821c49040e" exitCode=0 Apr 16 22:19:44.289172 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:44.288856 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" event={"ID":"ffd93146-ff80-48f7-9c41-e810e7b539c5","Type":"ContainerDied","Data":"c1159c18611ac0c510fab251b4eff76a72df275a7bfb43ffd6816e821c49040e"} Apr 16 22:19:46.298346 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:19:46.298315 2562 generic.go:358] "Generic (PLEG): container finished" podID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerID="3aa3b049c5590d97c6635bf80e1b666daefa24ee2418966998b1b12ff3d1af25" exitCode=0 Apr 16 22:19:46.298731 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:46.298372 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" event={"ID":"ffd93146-ff80-48f7-9c41-e810e7b539c5","Type":"ContainerDied","Data":"3aa3b049c5590d97c6635bf80e1b666daefa24ee2418966998b1b12ff3d1af25"} Apr 16 22:19:52.319533 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:52.319505 2562 generic.go:358] "Generic (PLEG): container finished" podID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerID="f750eb9309069ef3a5a84ffe2c188d9cbc7b599874d439bf4ff0bd7c4a9adf9a" exitCode=0 Apr 16 22:19:52.319899 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:52.319547 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" event={"ID":"ffd93146-ff80-48f7-9c41-e810e7b539c5","Type":"ContainerDied","Data":"f750eb9309069ef3a5a84ffe2c188d9cbc7b599874d439bf4ff0bd7c4a9adf9a"} Apr 16 22:19:53.438096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.438076 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:53.623774 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.623681 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util\") pod \"ffd93146-ff80-48f7-9c41-e810e7b539c5\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " Apr 16 22:19:53.623774 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.623736 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp\") pod \"ffd93146-ff80-48f7-9c41-e810e7b539c5\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " Apr 16 22:19:53.623774 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.623767 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle\") pod \"ffd93146-ff80-48f7-9c41-e810e7b539c5\" (UID: \"ffd93146-ff80-48f7-9c41-e810e7b539c5\") " Apr 16 22:19:53.624465 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.624437 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle" (OuterVolumeSpecName: "bundle") pod "ffd93146-ff80-48f7-9c41-e810e7b539c5" (UID: "ffd93146-ff80-48f7-9c41-e810e7b539c5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:53.627692 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.626664 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp" (OuterVolumeSpecName: "kube-api-access-k5nfp") pod "ffd93146-ff80-48f7-9c41-e810e7b539c5" (UID: "ffd93146-ff80-48f7-9c41-e810e7b539c5"). InnerVolumeSpecName "kube-api-access-k5nfp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:19:53.628878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.628851 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util" (OuterVolumeSpecName: "util") pod "ffd93146-ff80-48f7-9c41-e810e7b539c5" (UID: "ffd93146-ff80-48f7-9c41-e810e7b539c5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:19:53.724843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.724815 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5nfp\" (UniqueName: \"kubernetes.io/projected/ffd93146-ff80-48f7-9c41-e810e7b539c5-kube-api-access-k5nfp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:53.724843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.724839 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:53.724843 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:53.724849 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ffd93146-ff80-48f7-9c41-e810e7b539c5-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:19:54.326187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:54.326149 2562 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" event={"ID":"ffd93146-ff80-48f7-9c41-e810e7b539c5","Type":"ContainerDied","Data":"0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3"} Apr 16 22:19:54.326187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:54.326184 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e89682e53df8dd876a2954fa44a53115b439b08bd5905d4d251cc44e609b6d3" Apr 16 22:19:54.326187 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:54.326182 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cct5mp" Apr 16 22:19:59.716059 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716019 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6"] Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716280 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="util" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716291 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="util" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716302 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="pull" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716308 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="pull" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716330 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="extract" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716337 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="extract" Apr 16 22:19:59.716462 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.716382 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffd93146-ff80-48f7-9c41-e810e7b539c5" containerName="extract" Apr 16 22:19:59.772501 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.772471 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6"] Apr 16 22:19:59.772645 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.772583 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.775010 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.774975 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 22:19:59.775162 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.775109 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-pjsnc\"" Apr 16 22:19:59.775162 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.775156 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 22:19:59.775259 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.775164 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 22:19:59.864087 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.864060 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv292\" (UniqueName: 
\"kubernetes.io/projected/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-kube-api-access-pv292\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: \"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.864209 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.864091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: \"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.964589 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.964564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv292\" (UniqueName: \"kubernetes.io/projected/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-kube-api-access-pv292\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: \"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.964713 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.964595 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: \"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.966770 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.966721 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: 
\"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:19:59.973623 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:19:59.973584 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv292\" (UniqueName: \"kubernetes.io/projected/1b6d77f6-2402-4de2-87b3-9d6b6fb950b2-kube-api-access-pv292\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6\" (UID: \"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:20:00.083326 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:00.083307 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:20:00.200694 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:00.200672 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6"] Apr 16 22:20:00.203492 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:20:00.203466 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6d77f6_2402_4de2_87b3_9d6b6fb950b2.slice/crio-a67a12643b391cdad72720bc9ca5586d2b6d1da7ae564cc1bee6d94bef0537f9 WatchSource:0}: Error finding container a67a12643b391cdad72720bc9ca5586d2b6d1da7ae564cc1bee6d94bef0537f9: Status 404 returned error can't find the container with id a67a12643b391cdad72720bc9ca5586d2b6d1da7ae564cc1bee6d94bef0537f9 Apr 16 22:20:00.342994 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:00.342962 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" event={"ID":"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2","Type":"ContainerStarted","Data":"a67a12643b391cdad72720bc9ca5586d2b6d1da7ae564cc1bee6d94bef0537f9"} Apr 16 22:20:04.362273 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:20:04.362240 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" event={"ID":"1b6d77f6-2402-4de2-87b3-9d6b6fb950b2","Type":"ContainerStarted","Data":"0e4e7384c105a1625bfabf66f8984b9f75b222f47e36d9b241792bb3ef9243e3"} Apr 16 22:20:04.362628 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.362412 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:20:04.383419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.383371 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" podStartSLOduration=2.121561825 podStartE2EDuration="5.383359636s" podCreationTimestamp="2026-04-16 22:19:59 +0000 UTC" firstStartedPulling="2026-04-16 22:20:00.205163799 +0000 UTC m=+392.687682913" lastFinishedPulling="2026-04-16 22:20:03.466961597 +0000 UTC m=+395.949480724" observedRunningTime="2026-04-16 22:20:04.381996412 +0000 UTC m=+396.864515547" watchObservedRunningTime="2026-04-16 22:20:04.383359636 +0000 UTC m=+396.865878771" Apr 16 22:20:04.663169 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.663085 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hbkt"] Apr 16 22:20:04.666159 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.666142 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:04.668389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.668365 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 22:20:04.668525 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.668375 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 22:20:04.668613 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.668586 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zxc22\"" Apr 16 22:20:04.673776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.673755 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hbkt"] Apr 16 22:20:04.696352 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.696322 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q27bm\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-kube-api-access-q27bm\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:04.696488 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.696359 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:04.797486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.797449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q27bm\" (UniqueName: 
\"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-kube-api-access-q27bm\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:04.797486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.797491 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:04.797778 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:20:04.797596 2562 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 22:20:04.797778 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:20:04.797634 2562 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-6hbkt: secret "keda-admission-webhooks-certs" not found Apr 16 22:20:04.797778 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:20:04.797684 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates podName:53a2e7df-8595-4cae-8dc5-0466b14ec382 nodeName:}" failed. No retries permitted until 2026-04-16 22:20:05.297669343 +0000 UTC m=+397.780188455 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates") pod "keda-admission-cf49989db-6hbkt" (UID: "53a2e7df-8595-4cae-8dc5-0466b14ec382") : secret "keda-admission-webhooks-certs" not found Apr 16 22:20:04.808297 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:04.808261 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q27bm\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-kube-api-access-q27bm\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:05.301074 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:05.301036 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:05.303407 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:05.303378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/53a2e7df-8595-4cae-8dc5-0466b14ec382-certificates\") pod \"keda-admission-cf49989db-6hbkt\" (UID: \"53a2e7df-8595-4cae-8dc5-0466b14ec382\") " pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:05.576969 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:05.576872 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:05.703046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:05.703016 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-6hbkt"] Apr 16 22:20:05.705926 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:20:05.705899 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53a2e7df_8595_4cae_8dc5_0466b14ec382.slice/crio-5b618ddadfe61fd01b84224518f7bec7eb6fd1ee6b5e97b504968edd6a0ce879 WatchSource:0}: Error finding container 5b618ddadfe61fd01b84224518f7bec7eb6fd1ee6b5e97b504968edd6a0ce879: Status 404 returned error can't find the container with id 5b618ddadfe61fd01b84224518f7bec7eb6fd1ee6b5e97b504968edd6a0ce879 Apr 16 22:20:06.369210 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:06.369169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6hbkt" event={"ID":"53a2e7df-8595-4cae-8dc5-0466b14ec382","Type":"ContainerStarted","Data":"5b618ddadfe61fd01b84224518f7bec7eb6fd1ee6b5e97b504968edd6a0ce879"} Apr 16 22:20:07.373287 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:07.373253 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-6hbkt" event={"ID":"53a2e7df-8595-4cae-8dc5-0466b14ec382","Type":"ContainerStarted","Data":"92aff4086768d7e1495268153a7b67405ff8659fd93ca6e900ef56c5aabbebfd"} Apr 16 22:20:07.373629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:07.373365 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:07.389557 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:07.389514 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-6hbkt" podStartSLOduration=2.121536149 podStartE2EDuration="3.389500426s" 
podCreationTimestamp="2026-04-16 22:20:04 +0000 UTC" firstStartedPulling="2026-04-16 22:20:05.707597824 +0000 UTC m=+398.190116938" lastFinishedPulling="2026-04-16 22:20:06.975562099 +0000 UTC m=+399.458081215" observedRunningTime="2026-04-16 22:20:07.388512512 +0000 UTC m=+399.871031647" watchObservedRunningTime="2026-04-16 22:20:07.389500426 +0000 UTC m=+399.872019565" Apr 16 22:20:25.367123 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:25.367090 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-rn9z6" Apr 16 22:20:28.378053 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:28.378022 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-6hbkt" Apr 16 22:20:58.217369 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.217337 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8"] Apr 16 22:20:58.219724 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.219702 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.222287 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.222254 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:20:58.222975 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.222957 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\"" Apr 16 22:20:58.223075 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.222995 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:20:58.228630 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.228595 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8"] Apr 16 22:20:58.262397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.262364 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.262534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.262441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qlf\" (UniqueName: \"kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.262534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.262510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.362997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.362954 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.363163 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.363043 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.363163 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.363126 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29qlf\" (UniqueName: \"kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.363399 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.363380 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.363434 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.363393 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.371316 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.371293 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qlf\" (UniqueName: \"kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.529488 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.529416 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:20:58.639937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:58.639914 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8"] Apr 16 22:20:58.642449 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:20:58.642422 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21ee7a0_18a4_4293_9d1d_d6f66b519cba.slice/crio-0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d WatchSource:0}: Error finding container 0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d: Status 404 returned error can't find the container with id 0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d Apr 16 22:20:59.522993 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:59.522955 2562 generic.go:358] "Generic (PLEG): container finished" podID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerID="c10fd2a05600a8d0047a58d5b7a47c16268e87992247e93faffe0e6e9145e94c" exitCode=0 Apr 16 22:20:59.523328 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:59.523044 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" event={"ID":"f21ee7a0-18a4-4293-9d1d-d6f66b519cba","Type":"ContainerDied","Data":"c10fd2a05600a8d0047a58d5b7a47c16268e87992247e93faffe0e6e9145e94c"} Apr 16 22:20:59.523328 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:20:59.523077 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" event={"ID":"f21ee7a0-18a4-4293-9d1d-d6f66b519cba","Type":"ContainerStarted","Data":"0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d"} Apr 16 22:21:01.531573 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:21:01.531538 2562 generic.go:358] "Generic (PLEG): container finished" podID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerID="ea372bed3979a66ca9e25a0a7ac6d4f6c878c482aebe21c14f14d59fb3b25b8d" exitCode=0 Apr 16 22:21:01.531963 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:01.531625 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" event={"ID":"f21ee7a0-18a4-4293-9d1d-d6f66b519cba","Type":"ContainerDied","Data":"ea372bed3979a66ca9e25a0a7ac6d4f6c878c482aebe21c14f14d59fb3b25b8d"} Apr 16 22:21:02.537781 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:02.537749 2562 generic.go:358] "Generic (PLEG): container finished" podID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerID="a6296a09acafd82bd31943b95fb26109366dd996c97530dccf721de3780599f2" exitCode=0 Apr 16 22:21:02.538120 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:02.537800 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" event={"ID":"f21ee7a0-18a4-4293-9d1d-d6f66b519cba","Type":"ContainerDied","Data":"a6296a09acafd82bd31943b95fb26109366dd996c97530dccf721de3780599f2"} Apr 16 22:21:03.652311 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.652289 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:21:03.709319 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.709297 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle\") pod \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " Apr 16 22:21:03.709480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.709334 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29qlf\" (UniqueName: \"kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf\") pod \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " Apr 16 22:21:03.709480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.709386 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util\") pod \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\" (UID: \"f21ee7a0-18a4-4293-9d1d-d6f66b519cba\") " Apr 16 22:21:03.710042 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.710018 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle" (OuterVolumeSpecName: "bundle") pod "f21ee7a0-18a4-4293-9d1d-d6f66b519cba" (UID: "f21ee7a0-18a4-4293-9d1d-d6f66b519cba"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:21:03.711282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.711262 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf" (OuterVolumeSpecName: "kube-api-access-29qlf") pod "f21ee7a0-18a4-4293-9d1d-d6f66b519cba" (UID: "f21ee7a0-18a4-4293-9d1d-d6f66b519cba"). InnerVolumeSpecName "kube-api-access-29qlf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:21:03.714814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.714709 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util" (OuterVolumeSpecName: "util") pod "f21ee7a0-18a4-4293-9d1d-d6f66b519cba" (UID: "f21ee7a0-18a4-4293-9d1d-d6f66b519cba"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:21:03.810718 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.810670 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:21:03.810718 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.810690 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:21:03.810718 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:03.810699 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29qlf\" (UniqueName: \"kubernetes.io/projected/f21ee7a0-18a4-4293-9d1d-d6f66b519cba-kube-api-access-29qlf\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:21:04.546132 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:04.546105 2562 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" Apr 16 22:21:04.546320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:04.546100 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dxbq8" event={"ID":"f21ee7a0-18a4-4293-9d1d-d6f66b519cba","Type":"ContainerDied","Data":"0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d"} Apr 16 22:21:04.546320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:04.546217 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e7e3ba1de4f8d435cc557d4434a24b9cba298612b64feef19c4f7e100c8853d" Apr 16 22:21:10.640233 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640196 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr"] Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640491 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="util" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640501 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="util" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640518 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="pull" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640523 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="pull" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640530 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="extract" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640536 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="extract" Apr 16 22:21:10.640592 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.640575 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f21ee7a0-18a4-4293-9d1d-d6f66b519cba" containerName="extract" Apr 16 22:21:10.642870 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.642855 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.645434 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.645404 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 22:21:10.645551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.645417 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:21:10.645551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.645477 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-l2sc4\"" Apr 16 22:21:10.658000 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.657977 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr"] Apr 16 22:21:10.766163 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.766123 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrl5\" (UniqueName: \"kubernetes.io/projected/26238120-2c06-4d52-955d-1b240df11b07-kube-api-access-6qrl5\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.766312 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.766174 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26238120-2c06-4d52-955d-1b240df11b07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.867664 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.867587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qrl5\" (UniqueName: \"kubernetes.io/projected/26238120-2c06-4d52-955d-1b240df11b07-kube-api-access-6qrl5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.867840 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.867686 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26238120-2c06-4d52-955d-1b240df11b07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.868065 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.868045 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26238120-2c06-4d52-955d-1b240df11b07-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.876844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.876814 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrl5\" (UniqueName: \"kubernetes.io/projected/26238120-2c06-4d52-955d-1b240df11b07-kube-api-access-6qrl5\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-r6pnr\" (UID: \"26238120-2c06-4d52-955d-1b240df11b07\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:10.951952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:10.951844 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" Apr 16 22:21:11.073834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:11.073806 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr"] Apr 16 22:21:11.076539 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:21:11.076511 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26238120_2c06_4d52_955d_1b240df11b07.slice/crio-9d5e287fec6d0f4dc48f6d25170f270d6662cb6ca9b48207a9248bcde9d3d9ac WatchSource:0}: Error finding container 9d5e287fec6d0f4dc48f6d25170f270d6662cb6ca9b48207a9248bcde9d3d9ac: Status 404 returned error can't find the container with id 9d5e287fec6d0f4dc48f6d25170f270d6662cb6ca9b48207a9248bcde9d3d9ac Apr 16 22:21:11.572725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:11.572692 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" event={"ID":"26238120-2c06-4d52-955d-1b240df11b07","Type":"ContainerStarted","Data":"9d5e287fec6d0f4dc48f6d25170f270d6662cb6ca9b48207a9248bcde9d3d9ac"} Apr 16 22:21:13.579855 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:13.579819 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" event={"ID":"26238120-2c06-4d52-955d-1b240df11b07","Type":"ContainerStarted","Data":"b077424bcc8b5198fbe21f5515dea12288454aa5db11e2f52046d98a8b087ccd"}
Apr 16 22:21:13.600065 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:13.600012 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-r6pnr" podStartSLOduration=1.7248408149999999 podStartE2EDuration="3.599994379s" podCreationTimestamp="2026-04-16 22:21:10 +0000 UTC" firstStartedPulling="2026-04-16 22:21:11.07950759 +0000 UTC m=+463.562026706" lastFinishedPulling="2026-04-16 22:21:12.954661157 +0000 UTC m=+465.437180270" observedRunningTime="2026-04-16 22:21:13.597375555 +0000 UTC m=+466.079894690" watchObservedRunningTime="2026-04-16 22:21:13.599994379 +0000 UTC m=+466.082513515"
Apr 16 22:21:19.640333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.640300 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"]
Apr 16 22:21:19.642141 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.642126 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.644277 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.644252 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 22:21:19.644388 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.644363 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 22:21:19.645240 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.645214 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-mphgx\""
Apr 16 22:21:19.653095 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.653062 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"]
Apr 16 22:21:19.729246 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.729221 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.729360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.729277 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7qb5\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-kube-api-access-h7qb5\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.830226 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.830180 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7qb5\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-kube-api-access-h7qb5\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.830432 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.830275 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.837937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.837913 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.837995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.837931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7qb5\" (UniqueName: \"kubernetes.io/projected/093ba837-d5f6-4907-8788-0b4f751a84ac-kube-api-access-h7qb5\") pod \"cert-manager-cainjector-8966b78d4-wrsfb\" (UID: \"093ba837-d5f6-4907-8788-0b4f751a84ac\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.941024 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.940940 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"]
Apr 16 22:21:19.943045 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.943029 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:19.945294 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.945271 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:21:19.945389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.945309 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:21:19.945389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.945271 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\""
Apr 16 22:21:19.950456 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.950430 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"
Apr 16 22:21:19.950815 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:19.950783 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"]
Apr 16 22:21:20.032122 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.032091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.032279 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.032157 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.032279 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.032192 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshvw\" (UniqueName: \"kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.070411 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.070381 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-wrsfb"]
Apr 16 22:21:20.072959 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:21:20.072929 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod093ba837_d5f6_4907_8788_0b4f751a84ac.slice/crio-2291d46aef835dd75b537a9841daee25a77d8837b77ff68c1327f3eecb080038 WatchSource:0}: Error finding container 2291d46aef835dd75b537a9841daee25a77d8837b77ff68c1327f3eecb080038: Status 404 returned error can't find the container with id 2291d46aef835dd75b537a9841daee25a77d8837b77ff68c1327f3eecb080038
Apr 16 22:21:20.133504 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.133467 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.133682 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.133510 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wshvw\" (UniqueName: \"kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.133682 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.133593 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.133842 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.133822 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.133936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.133918 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.142388 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.142360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshvw\" (UniqueName: \"kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.253378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.253293 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:20.369384 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.369361 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"]
Apr 16 22:21:20.371528 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:21:20.371508 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff9ebc22_302d_4515_8380_67f872ff0894.slice/crio-b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734 WatchSource:0}: Error finding container b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734: Status 404 returned error can't find the container with id b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734
Apr 16 22:21:20.613224 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.613187 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb" event={"ID":"093ba837-d5f6-4907-8788-0b4f751a84ac","Type":"ContainerStarted","Data":"2291d46aef835dd75b537a9841daee25a77d8837b77ff68c1327f3eecb080038"}
Apr 16 22:21:20.616988 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.616962 2562 generic.go:358] "Generic (PLEG): container finished" podID="ff9ebc22-302d-4515-8380-67f872ff0894" containerID="0f37a005d7ee25441c3c252d20dcc73c41e38c0b4fc7d9555d29691044e3292c" exitCode=0
Apr 16 22:21:20.617109 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.616994 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb" event={"ID":"ff9ebc22-302d-4515-8380-67f872ff0894","Type":"ContainerDied","Data":"0f37a005d7ee25441c3c252d20dcc73c41e38c0b4fc7d9555d29691044e3292c"}
Apr 16 22:21:20.617109 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:20.617011 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb" event={"ID":"ff9ebc22-302d-4515-8380-67f872ff0894","Type":"ContainerStarted","Data":"b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734"}
Apr 16 22:21:24.638598 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:24.638553 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb" event={"ID":"093ba837-d5f6-4907-8788-0b4f751a84ac","Type":"ContainerStarted","Data":"991bb7d91a396dea92ef0d26355fb5f1b7df567643cabd432f92d1b3502ecc97"}
Apr 16 22:21:24.640140 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:24.640109 2562 generic.go:358] "Generic (PLEG): container finished" podID="ff9ebc22-302d-4515-8380-67f872ff0894" containerID="23bad2a48304764780dc1ae2ef1e4de35a4856019cbafb66a9231c1918865bd3" exitCode=0
Apr 16 22:21:24.640251 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:24.640155 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb" event={"ID":"ff9ebc22-302d-4515-8380-67f872ff0894","Type":"ContainerDied","Data":"23bad2a48304764780dc1ae2ef1e4de35a4856019cbafb66a9231c1918865bd3"}
Apr 16 22:21:24.653005 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:24.652963 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-wrsfb" podStartSLOduration=2.114020193 podStartE2EDuration="5.652946604s" podCreationTimestamp="2026-04-16 22:21:19 +0000 UTC" firstStartedPulling="2026-04-16 22:21:20.0746285 +0000 UTC m=+472.557147613" lastFinishedPulling="2026-04-16 22:21:23.613554908 +0000 UTC m=+476.096074024" observedRunningTime="2026-04-16 22:21:24.652179182 +0000 UTC m=+477.134698318" watchObservedRunningTime="2026-04-16 22:21:24.652946604 +0000 UTC m=+477.135465740"
Apr 16 22:21:25.645655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:25.645599 2562 generic.go:358] "Generic (PLEG): container finished" podID="ff9ebc22-302d-4515-8380-67f872ff0894" containerID="cc3627458fe33f224e488b947e03080539f57814f4e9c18d8fc4329850b8e4a6" exitCode=0
Apr 16 22:21:25.645655 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:25.645642 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb" event={"ID":"ff9ebc22-302d-4515-8380-67f872ff0894","Type":"ContainerDied","Data":"cc3627458fe33f224e488b947e03080539f57814f4e9c18d8fc4329850b8e4a6"}
Apr 16 22:21:26.769666 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.769643 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:26.889524 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.889493 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshvw\" (UniqueName: \"kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw\") pod \"ff9ebc22-302d-4515-8380-67f872ff0894\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") "
Apr 16 22:21:26.889656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.889583 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util\") pod \"ff9ebc22-302d-4515-8380-67f872ff0894\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") "
Apr 16 22:21:26.889656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.889627 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle\") pod \"ff9ebc22-302d-4515-8380-67f872ff0894\" (UID: \"ff9ebc22-302d-4515-8380-67f872ff0894\") "
Apr 16 22:21:26.890041 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.890011 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle" (OuterVolumeSpecName: "bundle") pod "ff9ebc22-302d-4515-8380-67f872ff0894" (UID: "ff9ebc22-302d-4515-8380-67f872ff0894"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:21:26.891597 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.891568 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw" (OuterVolumeSpecName: "kube-api-access-wshvw") pod "ff9ebc22-302d-4515-8380-67f872ff0894" (UID: "ff9ebc22-302d-4515-8380-67f872ff0894"). InnerVolumeSpecName "kube-api-access-wshvw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:21:26.893857 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.893830 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util" (OuterVolumeSpecName: "util") pod "ff9ebc22-302d-4515-8380-67f872ff0894" (UID: "ff9ebc22-302d-4515-8380-67f872ff0894"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:21:26.990704 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.990650 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:21:26.990704 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.990672 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff9ebc22-302d-4515-8380-67f872ff0894-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:21:26.990704 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:26.990681 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wshvw\" (UniqueName: \"kubernetes.io/projected/ff9ebc22-302d-4515-8380-67f872ff0894-kube-api-access-wshvw\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:21:27.653561 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:27.653523 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb" event={"ID":"ff9ebc22-302d-4515-8380-67f872ff0894","Type":"ContainerDied","Data":"b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734"}
Apr 16 22:21:27.653561 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:27.653563 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4dd884cf5b9dc2b8c48343b3c7800b0abc9b8ebc3b08070e6de459a9a0a8734"
Apr 16 22:21:27.653776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:27.653599 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4gxxb"
Apr 16 22:21:32.044385 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044342 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"]
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044685 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="pull"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044698 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="pull"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044710 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="util"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044717 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="util"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044722 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="extract"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044728 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="extract"
Apr 16 22:21:32.044891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.044787 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff9ebc22-302d-4515-8380-67f872ff0894" containerName="extract"
Apr 16 22:21:32.049126 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.049108 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.051486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.051459 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:21:32.051642 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.051493 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 22:21:32.052141 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.052122 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mwvfg\""
Apr 16 22:21:32.054987 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.054960 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"]
Apr 16 22:21:32.236457 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.236429 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-tmp\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.236640 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.236520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv96w\" (UniqueName: \"kubernetes.io/projected/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-kube-api-access-nv96w\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.337272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.337212 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv96w\" (UniqueName: \"kubernetes.io/projected/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-kube-api-access-nv96w\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.337272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.337252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-tmp\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.337667 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.337650 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-tmp\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.344683 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.344656 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv96w\" (UniqueName: \"kubernetes.io/projected/f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3-kube-api-access-nv96w\") pod \"openshift-lws-operator-bfc7f696d-rh468\" (UID: \"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.360424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.360405 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"
Apr 16 22:21:32.476731 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.476708 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468"]
Apr 16 22:21:32.479057 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:21:32.479031 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e8b5a8_1a99_4a2e_aaba_ee4be6a17ed3.slice/crio-e0f03d6f2a8c6badd1ef24ec8b8b185d4649ae71ad7e8236e0498246e7229b83 WatchSource:0}: Error finding container e0f03d6f2a8c6badd1ef24ec8b8b185d4649ae71ad7e8236e0498246e7229b83: Status 404 returned error can't find the container with id e0f03d6f2a8c6badd1ef24ec8b8b185d4649ae71ad7e8236e0498246e7229b83
Apr 16 22:21:32.671006 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:32.670941 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468" event={"ID":"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3","Type":"ContainerStarted","Data":"e0f03d6f2a8c6badd1ef24ec8b8b185d4649ae71ad7e8236e0498246e7229b83"}
Apr 16 22:21:34.678902 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:34.678863 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468" event={"ID":"f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3","Type":"ContainerStarted","Data":"34024f381797a7f4e4c5f745c620361dc01d68a36f63ce525bff999e409bf792"}
Apr 16 22:21:34.695413 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:34.695359 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rh468" podStartSLOduration=0.653288399 podStartE2EDuration="2.695341303s" podCreationTimestamp="2026-04-16 22:21:32 +0000 UTC" firstStartedPulling="2026-04-16 22:21:32.480496428 +0000 UTC m=+484.963015543" lastFinishedPulling="2026-04-16 22:21:34.522549331 +0000 UTC m=+487.005068447" observedRunningTime="2026-04-16 22:21:34.692586172 +0000 UTC m=+487.175105320" watchObservedRunningTime="2026-04-16 22:21:34.695341303 +0000 UTC m=+487.177860440"
Apr 16 22:21:56.591257 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.591214 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"]
Apr 16 22:21:56.593910 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.593887 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.596158 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.596128 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\""
Apr 16 22:21:56.596276 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.596135 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:21:56.596997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.596975 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:21:56.602308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.602277 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"]
Apr 16 22:21:56.605113 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.605086 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.605212 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.605150 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.605212 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.605190 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpnl2\" (UniqueName: \"kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.705996 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.705966 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.706107 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.706015 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.706107 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.706053 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpnl2\" (UniqueName: \"kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.706450 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.706432 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.706488 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.706448 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.713973 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.713937 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpnl2\" (UniqueName: \"kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:56.904457 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:56.904394 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:21:57.040010 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:57.039983 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"]
Apr 16 22:21:57.041983 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:21:57.041954 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97fc93f_dee6_477a_8a90_1a0e5d7741a9.slice/crio-bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3 WatchSource:0}: Error finding container bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3: Status 404 returned error can't find the container with id bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3
Apr 16 22:21:57.753369 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:57.753338 2562 generic.go:358] "Generic (PLEG): container finished" podID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerID="386523cd14840f97a84fbdc8cb626b2a0389ae5a9c8e92bee0ebab700c8c25d3" exitCode=0
Apr 16 22:21:57.753782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:57.753405 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" event={"ID":"d97fc93f-dee6-477a-8a90-1a0e5d7741a9","Type":"ContainerDied","Data":"386523cd14840f97a84fbdc8cb626b2a0389ae5a9c8e92bee0ebab700c8c25d3"}
Apr 16 22:21:57.753782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:57.753434 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" event={"ID":"d97fc93f-dee6-477a-8a90-1a0e5d7741a9","Type":"ContainerStarted","Data":"bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3"}
Apr 16 22:21:58.759287 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:58.758989 2562 generic.go:358] "Generic (PLEG): container finished" podID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerID="0e88837fb1e26693f5b9fe9a04ffab198ae2abf862ed702be077dadf495c26bd" exitCode=0
Apr 16 22:21:58.760514 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:58.760484 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" event={"ID":"d97fc93f-dee6-477a-8a90-1a0e5d7741a9","Type":"ContainerDied","Data":"0e88837fb1e26693f5b9fe9a04ffab198ae2abf862ed702be077dadf495c26bd"}
Apr 16 22:21:59.765158 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:59.765123 2562 generic.go:358] "Generic (PLEG): container finished" podID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerID="64415b3d99c90ac8e917c2c6a64abb093dc012ce569e2febbc22587fc397988b" exitCode=0
Apr 16 22:21:59.765544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:21:59.765203 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" event={"ID":"d97fc93f-dee6-477a-8a90-1a0e5d7741a9","Type":"ContainerDied","Data":"64415b3d99c90ac8e917c2c6a64abb093dc012ce569e2febbc22587fc397988b"}
Apr 16 22:22:00.887788 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.887766 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897"
Apr 16 22:22:00.940691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.940647 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle\") pod \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") "
Apr 16 22:22:00.940691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.940705 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util\") pod \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") "
Apr 16 22:22:00.940934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.940750 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpnl2\" (UniqueName: \"kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2\") pod \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\" (UID: \"d97fc93f-dee6-477a-8a90-1a0e5d7741a9\") "
Apr 16 22:22:00.941669 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.941645 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle" (OuterVolumeSpecName: "bundle") pod "d97fc93f-dee6-477a-8a90-1a0e5d7741a9" (UID: "d97fc93f-dee6-477a-8a90-1a0e5d7741a9"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:22:00.942815 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.942790 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2" (OuterVolumeSpecName: "kube-api-access-qpnl2") pod "d97fc93f-dee6-477a-8a90-1a0e5d7741a9" (UID: "d97fc93f-dee6-477a-8a90-1a0e5d7741a9"). InnerVolumeSpecName "kube-api-access-qpnl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:00.946631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:00.946585 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util" (OuterVolumeSpecName: "util") pod "d97fc93f-dee6-477a-8a90-1a0e5d7741a9" (UID: "d97fc93f-dee6-477a-8a90-1a0e5d7741a9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:22:01.042157 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.042134 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:01.042157 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.042155 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:01.042284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.042165 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpnl2\" (UniqueName: \"kubernetes.io/projected/d97fc93f-dee6-477a-8a90-1a0e5d7741a9-kube-api-access-qpnl2\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:01.774470 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.774436 2562 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" Apr 16 22:22:01.774691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.774436 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835sg897" event={"ID":"d97fc93f-dee6-477a-8a90-1a0e5d7741a9","Type":"ContainerDied","Data":"bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3"} Apr 16 22:22:01.774691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:01.774553 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6d0c2ac169f103ec1d43df6321c8689e2f0c650cf6719101be4da2a9572dd3" Apr 16 22:22:10.745357 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745318 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h"] Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745628 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="util" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745640 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="util" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745659 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="pull" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745666 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="pull" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745671 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="extract" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745677 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="extract" Apr 16 22:22:10.745793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.745722 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97fc93f-dee6-477a-8a90-1a0e5d7741a9" containerName="extract" Apr 16 22:22:10.748437 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.748418 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.750924 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.750900 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\"" Apr 16 22:22:10.750986 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.750922 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:22:10.751735 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.751719 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:22:10.757818 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.757792 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h"] Apr 16 22:22:10.816057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.816024 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xltl\" (UniqueName: \"kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: 
\"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.816211 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.816076 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.816211 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.816182 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.917074 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.917041 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xltl\" (UniqueName: \"kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.917200 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.917086 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: 
\"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.917200 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.917132 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.917516 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.917497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.917558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.917538 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:10.929389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:10.929363 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xltl\" (UniqueName: \"kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:11.057713 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:11.057680 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:11.187831 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:11.187806 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h"] Apr 16 22:22:11.190340 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:11.190309 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9ed1726_c8b9_427b_b3e1_3a2d97e705bd.slice/crio-d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339 WatchSource:0}: Error finding container d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339: Status 404 returned error can't find the container with id d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339 Apr 16 22:22:11.810108 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:11.810076 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerID="963cd0f5db904ba844078e0a9c7d90593f3790958e8ac6845ff8dd6751f6916d" exitCode=0 Apr 16 22:22:11.810482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:11.810164 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" event={"ID":"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd","Type":"ContainerDied","Data":"963cd0f5db904ba844078e0a9c7d90593f3790958e8ac6845ff8dd6751f6916d"} Apr 16 22:22:11.810482 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:11.810204 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" event={"ID":"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd","Type":"ContainerStarted","Data":"d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339"} Apr 16 22:22:12.090308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.090224 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh"] Apr 16 22:22:12.092253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.092236 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.094981 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.094958 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 22:22:12.094981 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.094975 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-95dsc\"" Apr 16 22:22:12.095282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.095258 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 22:22:12.108887 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.108858 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh"] Apr 16 22:22:12.127383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.127349 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9kdj\" (UniqueName: \"kubernetes.io/projected/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-kube-api-access-v9kdj\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 
16 22:22:12.127519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.127408 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.228081 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.228054 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9kdj\" (UniqueName: \"kubernetes.io/projected/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-kube-api-access-v9kdj\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.228250 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.228087 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.230370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.230347 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-operator-config\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.237509 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.237491 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9kdj\" (UniqueName: 
\"kubernetes.io/projected/6e44cbb8-6813-4bc3-ac3c-fd3789237eab-kube-api-access-v9kdj\") pod \"servicemesh-operator3-55f49c5f94-8c8zh\" (UID: \"6e44cbb8-6813-4bc3-ac3c-fd3789237eab\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.405270 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.405206 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:12.524835 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.524810 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh"] Apr 16 22:22:12.527051 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:12.527023 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e44cbb8_6813_4bc3_ac3c_fd3789237eab.slice/crio-c4392b2a14bbc27d88e4825981572e3622f757752644a675734aae0b43cba223 WatchSource:0}: Error finding container c4392b2a14bbc27d88e4825981572e3622f757752644a675734aae0b43cba223: Status 404 returned error can't find the container with id c4392b2a14bbc27d88e4825981572e3622f757752644a675734aae0b43cba223 Apr 16 22:22:12.815080 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:12.815051 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" event={"ID":"6e44cbb8-6813-4bc3-ac3c-fd3789237eab","Type":"ContainerStarted","Data":"c4392b2a14bbc27d88e4825981572e3622f757752644a675734aae0b43cba223"} Apr 16 22:22:13.820994 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:13.820964 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerID="f4f010e893ea6469d14d2536fb23080cf103a10a40789f8a6e46cf18fb318522" exitCode=0 Apr 16 22:22:13.821391 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:13.821002 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" event={"ID":"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd","Type":"ContainerDied","Data":"f4f010e893ea6469d14d2536fb23080cf103a10a40789f8a6e46cf18fb318522"} Apr 16 22:22:14.829095 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:14.829056 2562 generic.go:358] "Generic (PLEG): container finished" podID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerID="32b5d70c5ab9522ce44f9ce97c63ac1b3b8b8612b6f9e5fd9a9aaef61e0ba842" exitCode=0 Apr 16 22:22:14.829468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:14.829103 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" event={"ID":"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd","Type":"ContainerDied","Data":"32b5d70c5ab9522ce44f9ce97c63ac1b3b8b8612b6f9e5fd9a9aaef61e0ba842"} Apr 16 22:22:16.006979 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.006958 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:16.059429 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.059404 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util\") pod \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " Apr 16 22:22:16.059530 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.059472 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xltl\" (UniqueName: \"kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl\") pod \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " Apr 16 22:22:16.059530 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.059492 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle\") pod \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\" (UID: \"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd\") " Apr 16 22:22:16.060715 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.060688 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle" (OuterVolumeSpecName: "bundle") pod "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" (UID: "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:22:16.061293 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.061265 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl" (OuterVolumeSpecName: "kube-api-access-9xltl") pod "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" (UID: "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd"). InnerVolumeSpecName "kube-api-access-9xltl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:16.067025 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.066982 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util" (OuterVolumeSpecName: "util") pod "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" (UID: "d9ed1726-c8b9-427b-b3e1-3a2d97e705bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:22:16.160257 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.160223 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xltl\" (UniqueName: \"kubernetes.io/projected/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-kube-api-access-9xltl\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:16.160468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.160442 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:16.160468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.160469 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9ed1726-c8b9-427b-b3e1-3a2d97e705bd-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:22:16.840019 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.839923 2562 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" event={"ID":"d9ed1726-c8b9-427b-b3e1-3a2d97e705bd","Type":"ContainerDied","Data":"d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339"} Apr 16 22:22:16.840019 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.839959 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ps98h" Apr 16 22:22:16.840275 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.839967 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3e24d51638ce0efccfc541a453274b824a4c40270f933ac18830f7ab8208339" Apr 16 22:22:16.841851 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.841816 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" event={"ID":"6e44cbb8-6813-4bc3-ac3c-fd3789237eab","Type":"ContainerStarted","Data":"8797698ee740f8d967edf2466c4331ed98ad2093ed8ac5a0a970a31321caf482"} Apr 16 22:22:16.842041 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.842020 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:16.863038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:16.862989 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" podStartSLOduration=1.458338561 podStartE2EDuration="4.862978357s" podCreationTimestamp="2026-04-16 22:22:12 +0000 UTC" firstStartedPulling="2026-04-16 22:22:12.529413367 +0000 UTC m=+525.011932484" lastFinishedPulling="2026-04-16 22:22:15.934053164 +0000 UTC m=+528.416572280" observedRunningTime="2026-04-16 22:22:16.860172262 +0000 UTC m=+529.342691414" watchObservedRunningTime="2026-04-16 22:22:16.862978357 +0000 UTC m=+529.345497492" 
Apr 16 22:22:18.058892 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.058855 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:22:18.059324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059264 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="extract" Apr 16 22:22:18.059324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059281 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="extract" Apr 16 22:22:18.059324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059300 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="util" Apr 16 22:22:18.059324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059308 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="util" Apr 16 22:22:18.059324 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059321 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="pull" Apr 16 22:22:18.059558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059329 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="pull" Apr 16 22:22:18.059558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.059438 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9ed1726-c8b9-427b-b3e1-3a2d97e705bd" containerName="extract" Apr 16 22:22:18.061278 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.061258 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.063517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063487 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 22:22:18.063709 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063691 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 22:22:18.063801 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063755 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:22:18.063856 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063821 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 22:22:18.063979 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063958 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 22:22:18.064077 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.063989 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-frwnb\"" Apr 16 22:22:18.064077 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.064054 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:22:18.072055 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.072035 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:22:18.174248 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174223 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174375 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174269 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcr76\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174375 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174304 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174375 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174327 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174375 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174350 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174401 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.174515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.174435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275579 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275542 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcr76\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275767 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:22:18.275656 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275682 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275721 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.275767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275765 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.276085 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.275795 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.276548 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.276518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.278428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.278403 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.278544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.278430 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.278623 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.278565 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 
16 22:22:18.278753 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.278734 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.284434 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.284415 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcr76\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.284893 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.284874 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l2gh9\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.370264 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.370191 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:18.508131 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.508101 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:22:18.511061 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:18.511031 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7376575f_dab8_4278_8bd8_882f1ae8c7f7.slice/crio-5579c9bc8a4d57adbbbfdbcbd513392c611afa69698e6412d34fb2a758f3a471 WatchSource:0}: Error finding container 5579c9bc8a4d57adbbbfdbcbd513392c611afa69698e6412d34fb2a758f3a471: Status 404 returned error can't find the container with id 5579c9bc8a4d57adbbbfdbcbd513392c611afa69698e6412d34fb2a758f3a471 Apr 16 22:22:18.849314 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:18.849276 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" event={"ID":"7376575f-dab8-4278-8bd8-882f1ae8c7f7","Type":"ContainerStarted","Data":"5579c9bc8a4d57adbbbfdbcbd513392c611afa69698e6412d34fb2a758f3a471"} Apr 16 22:22:24.180723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:24.180677 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:22:24.180973 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:24.180754 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:22:24.872827 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:24.872786 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" 
event={"ID":"7376575f-dab8-4278-8bd8-882f1ae8c7f7","Type":"ContainerStarted","Data":"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a"} Apr 16 22:22:24.873021 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:24.872847 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:24.893940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:24.893895 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" podStartSLOduration=1.226584693 podStartE2EDuration="6.893881298s" podCreationTimestamp="2026-04-16 22:22:18 +0000 UTC" firstStartedPulling="2026-04-16 22:22:18.51315763 +0000 UTC m=+530.995676743" lastFinishedPulling="2026-04-16 22:22:24.180454225 +0000 UTC m=+536.662973348" observedRunningTime="2026-04-16 22:22:24.891984704 +0000 UTC m=+537.374503838" watchObservedRunningTime="2026-04-16 22:22:24.893881298 +0000 UTC m=+537.376400433" Apr 16 22:22:25.880360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:25.880328 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:22:27.847467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:27.847439 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-8c8zh" Apr 16 22:22:34.207057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.207021 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v"] Apr 16 22:22:34.209736 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.209719 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.212090 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.212067 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:22:34.212090 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.212081 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rwfcd\"" Apr 16 22:22:34.212625 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.212598 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:22:34.218014 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.217993 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v"] Apr 16 22:22:34.305231 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.305195 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.305400 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.305267 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.305400 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.305334 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n766h\" (UniqueName: \"kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.311674 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.311643 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"] Apr 16 22:22:34.313935 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.313920 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.322962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.322938 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"] Apr 16 22:22:34.404515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.404490 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"] Apr 16 22:22:34.406587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406563 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.406725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406594 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.406725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406647 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n766h\" (UniqueName: \"kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.406725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.406725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406694 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.406942 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406758 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7lbf\" (UniqueName: \"kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.406942 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406824 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.407029 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.406971 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.407029 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.407011 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.417694 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.417672 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"] Apr 16 22:22:34.420200 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.420180 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n766h\" 
(UniqueName: \"kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.518049 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.517970 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7lbf\" (UniqueName: \"kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.518271 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.518427 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518412 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518732 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518785 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518821 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddzp\" (UniqueName: \"kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.518931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.519131 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.519211 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" Apr 16 22:22:34.522878 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.520656 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"] Apr 16 22:22:34.531404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.529083 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"] Apr 16 22:22:34.531404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.529212 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" Apr 16 22:22:34.534262 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.534236 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7lbf\" (UniqueName: \"kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" Apr 16 22:22:34.619957 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.619924 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" Apr 16 22:22:34.620074 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.619993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" Apr 16 22:22:34.620116 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620091 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh7sx\" (UniqueName: \"kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: 
\"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" Apr 16 22:22:34.620177 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620160 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.620221 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620203 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.620275 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620242 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zddzp\" (UniqueName: \"kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" Apr 16 22:22:34.620552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620532 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:34.620646 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.620563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:34.623823 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.623750 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"
Apr 16 22:22:34.632038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.632013 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddzp\" (UniqueName: \"kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:34.643852 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.643729 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v"]
Apr 16 22:22:34.645504 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:34.645474 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2731be4_d8d2_41ff_a6fa_9f39d5b8dd80.slice/crio-e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9 WatchSource:0}: Error finding container e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9: Status 404 returned error can't find the container with id e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9
Apr 16 22:22:34.714569 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.714540 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:34.720908 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.720878 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vh7sx\" (UniqueName: \"kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.721151 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.721012 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.721151 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.721062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.724631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.721717 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.724631 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.721814 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.730809 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.730781 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh7sx\" (UniqueName: \"kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.746727 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.746700 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"]
Apr 16 22:22:34.748443 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:34.748412 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3198cd77_8af1_4fb4_94a2_0f44af2555e0.slice/crio-7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94 WatchSource:0}: Error finding container 7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94: Status 404 returned error can't find the container with id 7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94
Apr 16 22:22:34.841950 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.841925 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"]
Apr 16 22:22:34.843761 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:34.843735 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68bed436_4972_4f81_a264_f60e90f09813.slice/crio-4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd WatchSource:0}: Error finding container 4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd: Status 404 returned error can't find the container with id 4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd
Apr 16 22:22:34.854446 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.854424 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:34.911419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.911393 2562 generic.go:358] "Generic (PLEG): container finished" podID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerID="16a427271e37b226b6b9cf74177f5deda246184e33c12e54e0511f6362d47875" exitCode=0
Apr 16 22:22:34.911522 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.911470 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" event={"ID":"3198cd77-8af1-4fb4-94a2-0f44af2555e0","Type":"ContainerDied","Data":"16a427271e37b226b6b9cf74177f5deda246184e33c12e54e0511f6362d47875"}
Apr 16 22:22:34.911522 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.911492 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" event={"ID":"3198cd77-8af1-4fb4-94a2-0f44af2555e0","Type":"ContainerStarted","Data":"7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94"}
Apr 16 22:22:34.913268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.913134 2562 generic.go:358] "Generic (PLEG): container finished" podID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerID="c466abc4063af8269a33ad9948faeb8acbf15b76641d820633a79b38be89411f" exitCode=0
Apr 16 22:22:34.913268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.913186 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerDied","Data":"c466abc4063af8269a33ad9948faeb8acbf15b76641d820633a79b38be89411f"}
Apr 16 22:22:34.913268 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.913255 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerStarted","Data":"e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9"}
Apr 16 22:22:34.914333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.914303 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" event={"ID":"68bed436-4972-4f81-a264-f60e90f09813","Type":"ContainerStarted","Data":"4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd"}
Apr 16 22:22:34.981810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:34.981782 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"]
Apr 16 22:22:34.984313 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:34.984280 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode50bbe3c_3ddf_4ef2_966b_0684adc804f8.slice/crio-62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608 WatchSource:0}: Error finding container 62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608: Status 404 returned error can't find the container with id 62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608
Apr 16 22:22:35.920336 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.920309 2562 generic.go:358] "Generic (PLEG): container finished" podID="68bed436-4972-4f81-a264-f60e90f09813" containerID="69bfa1cb8cba7f02237d4e87c399aa5677914336bd87e389eea665bcfcc8d63d" exitCode=0
Apr 16 22:22:35.920696 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.920392 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" event={"ID":"68bed436-4972-4f81-a264-f60e90f09813","Type":"ContainerDied","Data":"69bfa1cb8cba7f02237d4e87c399aa5677914336bd87e389eea665bcfcc8d63d"}
Apr 16 22:22:35.922172 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.922154 2562 generic.go:358] "Generic (PLEG): container finished" podID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerID="57aeaa6426c2b4fcc2c20a0c8d1ba6813c128203cd28e4f448b6372923607ffd" exitCode=0
Apr 16 22:22:35.922262 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.922215 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" event={"ID":"3198cd77-8af1-4fb4-94a2-0f44af2555e0","Type":"ContainerDied","Data":"57aeaa6426c2b4fcc2c20a0c8d1ba6813c128203cd28e4f448b6372923607ffd"}
Apr 16 22:22:35.923749 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.923727 2562 generic.go:358] "Generic (PLEG): container finished" podID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerID="32b2363d565095ac3d7dce35c58cb55bc62b49bca0011e021b6df4a3d46ae7c1" exitCode=0
Apr 16 22:22:35.923861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.923810 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerDied","Data":"32b2363d565095ac3d7dce35c58cb55bc62b49bca0011e021b6df4a3d46ae7c1"}
Apr 16 22:22:35.923861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.923831 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerStarted","Data":"62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608"}
Apr 16 22:22:35.926053 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:35.926029 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerStarted","Data":"b53b62b1b19e7eb65c8820ecc0daff479e97bacddbc8f281d14f6c2c792be26e"}
Apr 16 22:22:36.932662 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.932636 2562 generic.go:358] "Generic (PLEG): container finished" podID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerID="e9c975b1b19256546e26e866eeec31cbd6a300045b7ebbe73c8662cb5ef3497a" exitCode=0
Apr 16 22:22:36.932995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.932706 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" event={"ID":"3198cd77-8af1-4fb4-94a2-0f44af2555e0","Type":"ContainerDied","Data":"e9c975b1b19256546e26e866eeec31cbd6a300045b7ebbe73c8662cb5ef3497a"}
Apr 16 22:22:36.934451 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.934430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerStarted","Data":"497b06be02c3ad9c6900f131eabbd9a3b3ac3ffd33c6a79e12177fde560ccb8c"}
Apr 16 22:22:36.936291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.936167 2562 generic.go:358] "Generic (PLEG): container finished" podID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerID="b53b62b1b19e7eb65c8820ecc0daff479e97bacddbc8f281d14f6c2c792be26e" exitCode=0
Apr 16 22:22:36.936291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.936255 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerDied","Data":"b53b62b1b19e7eb65c8820ecc0daff479e97bacddbc8f281d14f6c2c792be26e"}
Apr 16 22:22:36.938333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.938316 2562 generic.go:358] "Generic (PLEG): container finished" podID="68bed436-4972-4f81-a264-f60e90f09813" containerID="48cad2edb4779351755e94f0fd2c3e1b97158350dcbb0f8c8d842ca1d1761cd9" exitCode=0
Apr 16 22:22:36.938390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:36.938347 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" event={"ID":"68bed436-4972-4f81-a264-f60e90f09813","Type":"ContainerDied","Data":"48cad2edb4779351755e94f0fd2c3e1b97158350dcbb0f8c8d842ca1d1761cd9"}
Apr 16 22:22:37.944243 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.944200 2562 generic.go:358] "Generic (PLEG): container finished" podID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerID="497b06be02c3ad9c6900f131eabbd9a3b3ac3ffd33c6a79e12177fde560ccb8c" exitCode=0
Apr 16 22:22:37.944783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.944292 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerDied","Data":"497b06be02c3ad9c6900f131eabbd9a3b3ac3ffd33c6a79e12177fde560ccb8c"}
Apr 16 22:22:37.946506 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.946485 2562 generic.go:358] "Generic (PLEG): container finished" podID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerID="64f21f7e65e953d67df7cf3e023ce6f0bd51a640d1f42e4d5514bb794d836ac3" exitCode=0
Apr 16 22:22:37.946629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.946572 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerDied","Data":"64f21f7e65e953d67df7cf3e023ce6f0bd51a640d1f42e4d5514bb794d836ac3"}
Apr 16 22:22:37.948650 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.948592 2562 generic.go:358] "Generic (PLEG): container finished" podID="68bed436-4972-4f81-a264-f60e90f09813" containerID="8fb04fbaea344c3ca57a8b73debf3a3e053b8fa778b0528833f614824521102d" exitCode=0
Apr 16 22:22:37.948650 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:37.948636 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" event={"ID":"68bed436-4972-4f81-a264-f60e90f09813","Type":"ContainerDied","Data":"8fb04fbaea344c3ca57a8b73debf3a3e053b8fa778b0528833f614824521102d"}
Apr 16 22:22:38.086770 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.086740 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"
Apr 16 22:22:38.248464 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.248367 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle\") pod \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") "
Apr 16 22:22:38.248464 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.248430 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7lbf\" (UniqueName: \"kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf\") pod \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") "
Apr 16 22:22:38.248729 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.248471 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util\") pod \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\" (UID: \"3198cd77-8af1-4fb4-94a2-0f44af2555e0\") "
Apr 16 22:22:38.248870 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.248842 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle" (OuterVolumeSpecName: "bundle") pod "3198cd77-8af1-4fb4-94a2-0f44af2555e0" (UID: "3198cd77-8af1-4fb4-94a2-0f44af2555e0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:38.250549 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.250525 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf" (OuterVolumeSpecName: "kube-api-access-n7lbf") pod "3198cd77-8af1-4fb4-94a2-0f44af2555e0" (UID: "3198cd77-8af1-4fb4-94a2-0f44af2555e0"). InnerVolumeSpecName "kube-api-access-n7lbf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:22:38.254063 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.254039 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util" (OuterVolumeSpecName: "util") pod "3198cd77-8af1-4fb4-94a2-0f44af2555e0" (UID: "3198cd77-8af1-4fb4-94a2-0f44af2555e0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:38.349776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.349738 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:38.349776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.349770 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7lbf\" (UniqueName: \"kubernetes.io/projected/3198cd77-8af1-4fb4-94a2-0f44af2555e0-kube-api-access-n7lbf\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:38.349776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.349780 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3198cd77-8af1-4fb4-94a2-0f44af2555e0-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:38.954174 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.954139 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs" event={"ID":"3198cd77-8af1-4fb4-94a2-0f44af2555e0","Type":"ContainerDied","Data":"7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94"}
Apr 16 22:22:38.954586 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.954182 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f3922a1fb298ff38d73db2851b809aec113a530d62df260cbf66f9e15682c94"
Apr 16 22:22:38.954586 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.954156 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bnnbcs"
Apr 16 22:22:38.958998 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.958961 2562 generic.go:358] "Generic (PLEG): container finished" podID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerID="d7e59b44cd54da1d59a8402233b5763cf7b67a0b85938764b9aa869613f04adf" exitCode=0
Apr 16 22:22:38.959146 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:38.959000 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerDied","Data":"d7e59b44cd54da1d59a8402233b5763cf7b67a0b85938764b9aa869613f04adf"}
Apr 16 22:22:39.087593 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.087571 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:39.118742 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.118717 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v"
Apr 16 22:22:39.256560 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256459 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n766h\" (UniqueName: \"kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h\") pod \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") "
Apr 16 22:22:39.256560 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256518 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util\") pod \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") "
Apr 16 22:22:39.256560 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256541 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zddzp\" (UniqueName: \"kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp\") pod \"68bed436-4972-4f81-a264-f60e90f09813\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") "
Apr 16 22:22:39.256879 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256579 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle\") pod \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\" (UID: \"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80\") "
Apr 16 22:22:39.256879 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256658 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle\") pod \"68bed436-4972-4f81-a264-f60e90f09813\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") "
Apr 16 22:22:39.256879 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.256709 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util\") pod \"68bed436-4972-4f81-a264-f60e90f09813\" (UID: \"68bed436-4972-4f81-a264-f60e90f09813\") "
Apr 16 22:22:39.257212 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.257182 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle" (OuterVolumeSpecName: "bundle") pod "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" (UID: "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:39.257417 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.257385 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle" (OuterVolumeSpecName: "bundle") pod "68bed436-4972-4f81-a264-f60e90f09813" (UID: "68bed436-4972-4f81-a264-f60e90f09813"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:39.258804 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.258777 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp" (OuterVolumeSpecName: "kube-api-access-zddzp") pod "68bed436-4972-4f81-a264-f60e90f09813" (UID: "68bed436-4972-4f81-a264-f60e90f09813"). InnerVolumeSpecName "kube-api-access-zddzp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:22:39.258897 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.258821 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h" (OuterVolumeSpecName: "kube-api-access-n766h") pod "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" (UID: "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80"). InnerVolumeSpecName "kube-api-access-n766h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:22:39.262510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.262486 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util" (OuterVolumeSpecName: "util") pod "68bed436-4972-4f81-a264-f60e90f09813" (UID: "68bed436-4972-4f81-a264-f60e90f09813"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:39.262671 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.262653 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util" (OuterVolumeSpecName: "util") pod "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" (UID: "b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:39.358326 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358288 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.358326 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358320 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n766h\" (UniqueName: \"kubernetes.io/projected/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-kube-api-access-n766h\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.358326 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358331 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.358552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358341 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zddzp\" (UniqueName: \"kubernetes.io/projected/68bed436-4972-4f81-a264-f60e90f09813-kube-api-access-zddzp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.358552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358350 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.358552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.358359 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68bed436-4972-4f81-a264-f60e90f09813-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:39.964627 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.964584 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v"
Apr 16 22:22:39.964627 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.964589 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec88fzb9v" event={"ID":"b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80","Type":"ContainerDied","Data":"e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9"}
Apr 16 22:22:39.965146 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.964638 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a249b3da399eb46c8605caa3b9fa79b358e30b0e431db7f5a34a71480e1cf9"
Apr 16 22:22:39.966294 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.966268 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n"
Apr 16 22:22:39.966294 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.966284 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30sg69n" event={"ID":"68bed436-4972-4f81-a264-f60e90f09813","Type":"ContainerDied","Data":"4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd"}
Apr 16 22:22:39.966492 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:39.966304 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e33ae4367f8580570dbf070aeebd110d5f5bbaedd574f4915095c0c8d0967bd"
Apr 16 22:22:40.083185 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.083162 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:40.267036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.266947 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle\") pod \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") "
Apr 16 22:22:40.267036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.267009 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util\") pod \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") "
Apr 16 22:22:40.267036 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.267035 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh7sx\" (UniqueName: \"kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx\") pod \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\" (UID: \"e50bbe3c-3ddf-4ef2-966b-0684adc804f8\") "
Apr 16 22:22:40.267743 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.267711 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle" (OuterVolumeSpecName: "bundle") pod "e50bbe3c-3ddf-4ef2-966b-0684adc804f8" (UID: "e50bbe3c-3ddf-4ef2-966b-0684adc804f8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:40.269206 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.269187 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx" (OuterVolumeSpecName: "kube-api-access-vh7sx") pod "e50bbe3c-3ddf-4ef2-966b-0684adc804f8" (UID: "e50bbe3c-3ddf-4ef2-966b-0684adc804f8"). InnerVolumeSpecName "kube-api-access-vh7sx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:22:40.272212 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.272189 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util" (OuterVolumeSpecName: "util") pod "e50bbe3c-3ddf-4ef2-966b-0684adc804f8" (UID: "e50bbe3c-3ddf-4ef2-966b-0684adc804f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:22:40.367636 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.367563 2562 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:40.367636 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.367630 2562 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-util\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:40.367636 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.367645 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vh7sx\" (UniqueName: \"kubernetes.io/projected/e50bbe3c-3ddf-4ef2-966b-0684adc804f8-kube-api-access-vh7sx\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:22:40.971127 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.971100 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd"
Apr 16 22:22:40.971499 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.971095 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503lw2sd" event={"ID":"e50bbe3c-3ddf-4ef2-966b-0684adc804f8","Type":"ContainerDied","Data":"62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608"}
Apr 16 22:22:40.971499 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:40.971206 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62238d6d8a61195ea33430a4ecdedaf15c3ebf4d72ae6e53dfbf0beef87f2608"
Apr 16 22:22:45.219486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219449 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt"]
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219770 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="pull"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219784 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="pull"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219793 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219799 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219805 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="extract"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219813 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="extract"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219820 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="extract"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219827 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="extract"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219834 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219839 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219848 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219853 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="util"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219861 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="extract"
Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219866 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="extract"
Apr 16
22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219874 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="pull" Apr 16 22:22:45.219877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219879 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="pull" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219888 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="pull" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219893 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="pull" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219900 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="pull" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219905 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="pull" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219910 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="extract" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219915 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="extract" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219922 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="util" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:22:45.219926 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="util" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219973 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="3198cd77-8af1-4fb4-94a2-0f44af2555e0" containerName="extract" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219980 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e50bbe3c-3ddf-4ef2-966b-0684adc804f8" containerName="extract" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219989 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2731be4-d8d2-41ff-a6fa-9f39d5b8dd80" containerName="extract" Apr 16 22:22:45.220424 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.219995 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="68bed436-4972-4f81-a264-f60e90f09813" containerName="extract" Apr 16 22:22:45.224791 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.224772 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:45.227199 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.227174 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 22:22:45.227305 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.227182 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 22:22:45.227786 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.227769 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-99ggf\"" Apr 16 22:22:45.231374 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.231354 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt"] Apr 16 22:22:45.409824 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.409780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwj4v\" (UniqueName: \"kubernetes.io/projected/9c333486-f7ee-42fc-8009-f0ee787db97a-kube-api-access-kwj4v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wrrnt\" (UID: \"9c333486-f7ee-42fc-8009-f0ee787db97a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:45.510368 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.510280 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwj4v\" (UniqueName: \"kubernetes.io/projected/9c333486-f7ee-42fc-8009-f0ee787db97a-kube-api-access-kwj4v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wrrnt\" (UID: \"9c333486-f7ee-42fc-8009-f0ee787db97a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:45.536778 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:22:45.536752 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwj4v\" (UniqueName: \"kubernetes.io/projected/9c333486-f7ee-42fc-8009-f0ee787db97a-kube-api-access-kwj4v\") pod \"limitador-operator-controller-manager-c7fb4c8d5-wrrnt\" (UID: \"9c333486-f7ee-42fc-8009-f0ee787db97a\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:45.835767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.835727 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:45.965734 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.965711 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt"] Apr 16 22:22:45.968243 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:45.968214 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c333486_f7ee_42fc_8009_f0ee787db97a.slice/crio-e9636c85303d220fe18d9e4a3dd61a46abaeab24147e10b37c43b8672b027627 WatchSource:0}: Error finding container e9636c85303d220fe18d9e4a3dd61a46abaeab24147e10b37c43b8672b027627: Status 404 returned error can't find the container with id e9636c85303d220fe18d9e4a3dd61a46abaeab24147e10b37c43b8672b027627 Apr 16 22:22:45.993127 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:45.993102 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" event={"ID":"9c333486-f7ee-42fc-8009-f0ee787db97a","Type":"ContainerStarted","Data":"e9636c85303d220fe18d9e4a3dd61a46abaeab24147e10b37c43b8672b027627"} Apr 16 22:22:48.526308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.526274 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9"] Apr 16 22:22:48.530395 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.530374 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.532812 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.532783 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-t8jv6\"" Apr 16 22:22:48.540301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.539887 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb92r\" (UniqueName: \"kubernetes.io/projected/34a1ee5a-41c2-4f38-a715-36a1e3816faf-kube-api-access-kb92r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.540301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.540175 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/34a1ee5a-41c2-4f38-a715-36a1e3816faf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.542738 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.542707 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9"] Apr 16 22:22:48.640954 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.640885 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/34a1ee5a-41c2-4f38-a715-36a1e3816faf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.640954 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.640936 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb92r\" (UniqueName: \"kubernetes.io/projected/34a1ee5a-41c2-4f38-a715-36a1e3816faf-kube-api-access-kb92r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.641332 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.641315 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/34a1ee5a-41c2-4f38-a715-36a1e3816faf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.652258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.652228 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb92r\" (UniqueName: \"kubernetes.io/projected/34a1ee5a-41c2-4f38-a715-36a1e3816faf-kube-api-access-kb92r\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-88ww9\" (UID: \"34a1ee5a-41c2-4f38-a715-36a1e3816faf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.842663 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.842622 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:48.983440 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:48.983406 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9"] Apr 16 22:22:48.985760 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:48.985730 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a1ee5a_41c2_4f38_a715_36a1e3816faf.slice/crio-85f35bef5f1f9db803339cf429d99d3dc7314f71a62abf79812d8b606f616cf3 WatchSource:0}: Error finding container 85f35bef5f1f9db803339cf429d99d3dc7314f71a62abf79812d8b606f616cf3: Status 404 returned error can't find the container with id 85f35bef5f1f9db803339cf429d99d3dc7314f71a62abf79812d8b606f616cf3 Apr 16 22:22:49.008367 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:49.008332 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" event={"ID":"9c333486-f7ee-42fc-8009-f0ee787db97a","Type":"ContainerStarted","Data":"bbf31763e3ea7aacb591698e53a694ce7631c89267db2204bf022e5415dfd8cf"} Apr 16 22:22:49.008519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:49.008500 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:22:49.009559 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:49.009535 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" event={"ID":"34a1ee5a-41c2-4f38-a715-36a1e3816faf","Type":"ContainerStarted","Data":"85f35bef5f1f9db803339cf429d99d3dc7314f71a62abf79812d8b606f616cf3"} Apr 16 22:22:53.059980 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.059910 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" podStartSLOduration=5.66993219 podStartE2EDuration="8.059886708s" podCreationTimestamp="2026-04-16 22:22:45 +0000 UTC" firstStartedPulling="2026-04-16 22:22:45.970213608 +0000 UTC m=+558.452732721" lastFinishedPulling="2026-04-16 22:22:48.360168112 +0000 UTC m=+560.842687239" observedRunningTime="2026-04-16 22:22:49.050286238 +0000 UTC m=+561.532805373" watchObservedRunningTime="2026-04-16 22:22:53.059886708 +0000 UTC m=+565.542405845" Apr 16 22:22:53.062266 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.062235 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"] Apr 16 22:22:53.438120 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.438040 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-22b7x"] Apr 16 22:22:53.441473 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.441457 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:53.447183 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.447160 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-lllwz\"" Apr 16 22:22:53.451572 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.451553 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-22b7x"] Apr 16 22:22:53.480857 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.480834 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnw7\" (UniqueName: \"kubernetes.io/projected/471e9843-40f5-4d9c-a5ca-656553b34579-kube-api-access-psnw7\") pod \"authorino-operator-7587b89b76-22b7x\" (UID: \"471e9843-40f5-4d9c-a5ca-656553b34579\") " pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:53.581273 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.581240 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psnw7\" (UniqueName: \"kubernetes.io/projected/471e9843-40f5-4d9c-a5ca-656553b34579-kube-api-access-psnw7\") pod \"authorino-operator-7587b89b76-22b7x\" (UID: \"471e9843-40f5-4d9c-a5ca-656553b34579\") " pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:53.593030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.593002 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnw7\" (UniqueName: \"kubernetes.io/projected/471e9843-40f5-4d9c-a5ca-656553b34579-kube-api-access-psnw7\") pod \"authorino-operator-7587b89b76-22b7x\" (UID: \"471e9843-40f5-4d9c-a5ca-656553b34579\") " pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:53.753799 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:53.753713 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:55.032804 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:55.032780 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-22b7x"] Apr 16 22:22:55.035197 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:22:55.035169 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod471e9843_40f5_4d9c_a5ca_656553b34579.slice/crio-b0a5167b6074c8bf068c69bdd624c66282ad49026a80fe42cf24e65df854bdc6 WatchSource:0}: Error finding container b0a5167b6074c8bf068c69bdd624c66282ad49026a80fe42cf24e65df854bdc6: Status 404 returned error can't find the container with id b0a5167b6074c8bf068c69bdd624c66282ad49026a80fe42cf24e65df854bdc6 Apr 16 22:22:55.036400 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:55.036348 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" event={"ID":"34a1ee5a-41c2-4f38-a715-36a1e3816faf","Type":"ContainerStarted","Data":"1dcae9027d42fe6ff19e718290fe6d36fce25ea075c95a8fc0850a41e32e240d"} Apr 16 22:22:55.036520 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:55.036471 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:22:55.063751 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:55.063703 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" podStartSLOduration=1.100215425 podStartE2EDuration="7.063689907s" podCreationTimestamp="2026-04-16 22:22:48 +0000 UTC" firstStartedPulling="2026-04-16 22:22:48.989037538 +0000 UTC m=+561.471556651" lastFinishedPulling="2026-04-16 22:22:54.952512009 +0000 UTC m=+567.435031133" observedRunningTime="2026-04-16 22:22:55.060648374 
+0000 UTC m=+567.543167508" watchObservedRunningTime="2026-04-16 22:22:55.063689907 +0000 UTC m=+567.546209042" Apr 16 22:22:56.042194 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:56.042159 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" event={"ID":"471e9843-40f5-4d9c-a5ca-656553b34579","Type":"ContainerStarted","Data":"b0a5167b6074c8bf068c69bdd624c66282ad49026a80fe42cf24e65df854bdc6"} Apr 16 22:22:57.047749 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:57.047715 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" event={"ID":"471e9843-40f5-4d9c-a5ca-656553b34579","Type":"ContainerStarted","Data":"f407977926beffb2d2545142ce1eb4b612b9192fe15d0df364b4b961f6f9a456"} Apr 16 22:22:57.048176 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:57.047826 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:22:57.064645 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:22:57.064583 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" podStartSLOduration=2.840788866 podStartE2EDuration="4.06457222s" podCreationTimestamp="2026-04-16 22:22:53 +0000 UTC" firstStartedPulling="2026-04-16 22:22:55.037738299 +0000 UTC m=+567.520257430" lastFinishedPulling="2026-04-16 22:22:56.261521662 +0000 UTC m=+568.744040784" observedRunningTime="2026-04-16 22:22:57.063253869 +0000 UTC m=+569.545773014" watchObservedRunningTime="2026-04-16 22:22:57.06457222 +0000 UTC m=+569.547091355" Apr 16 22:23:00.017520 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:00.017482 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-wrrnt" Apr 16 22:23:06.045844 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:06.045807 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-88ww9" Apr 16 22:23:08.053474 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:08.053446 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-22b7x" Apr 16 22:23:18.084586 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.084519 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bc597bf69-zxj7c" podUID="055f8963-a89a-47a1-bdd6-1bea09c15863" containerName="console" containerID="cri-o://63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523" gracePeriod=15 Apr 16 22:23:18.319001 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.318978 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bc597bf69-zxj7c_055f8963-a89a-47a1-bdd6-1bea09c15863/console/0.log" Apr 16 22:23:18.319116 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.319040 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bc597bf69-zxj7c" Apr 16 22:23:18.388889 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.388806 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.388889 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.388860 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389076 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.388983 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389076 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389052 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389145 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389078 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389145 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:23:18.389129 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389222 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389173 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn2sj\" (UniqueName: \"kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj\") pod \"055f8963-a89a-47a1-bdd6-1bea09c15863\" (UID: \"055f8963-a89a-47a1-bdd6-1bea09c15863\") " Apr 16 22:23:18.389273 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389254 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config" (OuterVolumeSpecName: "console-config") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:18.389332 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389317 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:18.389433 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389397 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:18.389564 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389454 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca" (OuterVolumeSpecName: "service-ca") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:18.389564 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389473 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-console-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:18.389564 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.389504 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-trusted-ca-bundle\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:18.391156 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.391125 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:23:18.391291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.391273 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:23:18.391291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.391280 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj" (OuterVolumeSpecName: "kube-api-access-bn2sj") pod "055f8963-a89a-47a1-bdd6-1bea09c15863" (UID: "055f8963-a89a-47a1-bdd6-1bea09c15863"). InnerVolumeSpecName "kube-api-access-bn2sj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:23:18.490723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.490680 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-oauth-config\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:18.490723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.490715 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-oauth-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:18.490723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.490729 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/055f8963-a89a-47a1-bdd6-1bea09c15863-service-ca\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" 
Apr 16 22:23:18.490958 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.490742 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/055f8963-a89a-47a1-bdd6-1bea09c15863-console-serving-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:18.490958 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:18.490755 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bn2sj\" (UniqueName: \"kubernetes.io/projected/055f8963-a89a-47a1-bdd6-1bea09c15863-kube-api-access-bn2sj\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:23:19.135325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135298 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bc597bf69-zxj7c_055f8963-a89a-47a1-bdd6-1bea09c15863/console/0.log" Apr 16 22:23:19.135808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135343 2562 generic.go:358] "Generic (PLEG): container finished" podID="055f8963-a89a-47a1-bdd6-1bea09c15863" containerID="63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523" exitCode=2 Apr 16 22:23:19.135808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135412 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bc597bf69-zxj7c" Apr 16 22:23:19.135808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135416 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc597bf69-zxj7c" event={"ID":"055f8963-a89a-47a1-bdd6-1bea09c15863","Type":"ContainerDied","Data":"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523"} Apr 16 22:23:19.135808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135453 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc597bf69-zxj7c" event={"ID":"055f8963-a89a-47a1-bdd6-1bea09c15863","Type":"ContainerDied","Data":"774f49928f0279583c46b4aec8c1dc03c0a0c1f113156f5983b3391c795395bf"} Apr 16 22:23:19.135808 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.135476 2562 scope.go:117] "RemoveContainer" containerID="63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523" Apr 16 22:23:19.146230 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.146210 2562 scope.go:117] "RemoveContainer" containerID="63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523" Apr 16 22:23:19.146496 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:23:19.146480 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523\": container with ID starting with 63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523 not found: ID does not exist" containerID="63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523" Apr 16 22:23:19.146536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.146505 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523"} err="failed to get container status \"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523\": rpc error: code = 
NotFound desc = could not find container \"63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523\": container with ID starting with 63471f637d0971a59c7928d723152acd89a1ca9bdc5d14410c4d5b1b20be5523 not found: ID does not exist" Apr 16 22:23:19.159072 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.159040 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"] Apr 16 22:23:19.164403 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:19.164380 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bc597bf69-zxj7c"] Apr 16 22:23:20.130814 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:23:20.130777 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="055f8963-a89a-47a1-bdd6-1bea09c15863" path="/var/lib/kubelet/pods/055f8963-a89a-47a1-bdd6-1bea09c15863/volumes" Apr 16 22:24:11.886356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.886320 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq"] Apr 16 22:24:11.886783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.886646 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="055f8963-a89a-47a1-bdd6-1bea09c15863" containerName="console" Apr 16 22:24:11.886783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.886662 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="055f8963-a89a-47a1-bdd6-1bea09c15863" containerName="console" Apr 16 22:24:11.886783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.886747 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="055f8963-a89a-47a1-bdd6-1bea09c15863" containerName="console" Apr 16 22:24:11.889484 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.889468 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:11.899083 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:11.899052 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq"] Apr 16 22:24:12.024121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024092 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c4dc88be-3703-4f37-817c-53ef9e2bd820-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024303 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024143 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024303 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024197 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pxl\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-kube-api-access-g6pxl\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024303 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024250 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-kubeconfig\") 
pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024303 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024273 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024478 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024308 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.024478 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.024348 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.125761 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 
22:24:12.125918 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125777 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pxl\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-kube-api-access-g6pxl\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.125918 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125828 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.125918 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125853 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.125918 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.126127 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.125935 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.126190 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.126151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c4dc88be-3703-4f37-817c-53ef9e2bd820-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.126494 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.126464 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.128649 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.128623 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.128806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.128783 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" 
Apr 16 22:24:12.129042 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.129024 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c4dc88be-3703-4f37-817c-53ef9e2bd820-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.129196 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.129174 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c4dc88be-3703-4f37-817c-53ef9e2bd820-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.134297 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.134271 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.134496 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.134480 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pxl\" (UniqueName: \"kubernetes.io/projected/c4dc88be-3703-4f37-817c-53ef9e2bd820-kube-api-access-g6pxl\") pod \"istiod-openshift-gateway-55ff986f96-lsndq\" (UID: \"c4dc88be-3703-4f37-817c-53ef9e2bd820\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.198521 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.198442 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:12.352807 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.352781 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:24:12.353372 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.353325 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:24:12.353461 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.353447 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:24:12.356407 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:12.356381 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq"] Apr 16 22:24:13.345478 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.345446 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" event={"ID":"c4dc88be-3703-4f37-817c-53ef9e2bd820","Type":"ContainerStarted","Data":"ea07ac9943bfb96d249d94d282a75d63cc3531ee42458c4b130ebc4ec81a6d4c"} Apr 16 22:24:13.345478 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.345481 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" event={"ID":"c4dc88be-3703-4f37-817c-53ef9e2bd820","Type":"ContainerStarted","Data":"3fe1c3db9e70923cca0ebf4334086574edb59a049b192bf218d3d86f8eef8ec4"} Apr 16 22:24:13.345994 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.345628 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:13.347161 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:24:13.347139 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" Apr 16 22:24:13.378809 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.378760 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-lsndq" podStartSLOduration=2.378742755 podStartE2EDuration="2.378742755s" podCreationTimestamp="2026-04-16 22:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:24:13.377417962 +0000 UTC m=+645.859937114" watchObservedRunningTime="2026-04-16 22:24:13.378742755 +0000 UTC m=+645.861261890" Apr 16 22:24:13.456872 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.456813 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:24:13.457360 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.457325 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" podUID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" containerName="discovery" containerID="cri-o://69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a" gracePeriod=30 Apr 16 22:24:13.715210 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.715189 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:24:13.846536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846504 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846536 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846540 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846567 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846585 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846626 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcr76\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846938 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:24:13.846840 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.846938 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.846922 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig\") pod \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\" (UID: \"7376575f-dab8-4278-8bd8-882f1ae8c7f7\") " Apr 16 22:24:13.847155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.847051 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:24:13.847291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.847243 2562 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-ca-configmap\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.849342 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849285 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "istio-csr-dns-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:24:13.849429 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849360 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76" (OuterVolumeSpecName: "kube-api-access-bcr76") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "kube-api-access-bcr76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:24:13.849477 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849460 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:24:13.849517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849469 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token" (OuterVolumeSpecName: "istio-token") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:24:13.849517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849483 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts" (OuterVolumeSpecName: "cacerts") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "cacerts". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:24:13.849788 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.849767 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs" (OuterVolumeSpecName: "local-certs") pod "7376575f-dab8-4278-8bd8-882f1ae8c7f7" (UID: "7376575f-dab8-4278-8bd8-882f1ae8c7f7"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948200 2562 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7376575f-dab8-4278-8bd8-882f1ae8c7f7-local-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948224 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bcr76\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-kube-api-access-bcr76\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948235 2562 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-csr-dns-cert\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948243 2562 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-kubeconfig\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948252 2562 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/7376575f-dab8-4278-8bd8-882f1ae8c7f7-cacerts\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:13.948258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:13.948260 2562 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7376575f-dab8-4278-8bd8-882f1ae8c7f7-istio-token\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:24:14.350377 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.350341 2562 generic.go:358] "Generic (PLEG): container finished" podID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" containerID="69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a" exitCode=0 Apr 16 22:24:14.350821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.350401 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" Apr 16 22:24:14.350821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.350424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" event={"ID":"7376575f-dab8-4278-8bd8-882f1ae8c7f7","Type":"ContainerDied","Data":"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a"} Apr 16 22:24:14.350821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.350484 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9" event={"ID":"7376575f-dab8-4278-8bd8-882f1ae8c7f7","Type":"ContainerDied","Data":"5579c9bc8a4d57adbbbfdbcbd513392c611afa69698e6412d34fb2a758f3a471"} Apr 16 22:24:14.350821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.350500 2562 scope.go:117] "RemoveContainer" containerID="69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a" Apr 16 22:24:14.358857 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.358840 2562 scope.go:117] "RemoveContainer" 
containerID="69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a" Apr 16 22:24:14.359127 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:24:14.359109 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a\": container with ID starting with 69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a not found: ID does not exist" containerID="69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a" Apr 16 22:24:14.359202 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.359134 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a"} err="failed to get container status \"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a\": rpc error: code = NotFound desc = could not find container \"69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a\": container with ID starting with 69e747db26a1eb57d594d33affb73b4559daa668a006bde12cb5a64cffd2860a not found: ID does not exist" Apr 16 22:24:14.382951 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.382928 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:24:14.395898 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:14.395871 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l2gh9"] Apr 16 22:24:16.131160 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:16.131125 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" path="/var/lib/kubelet/pods/7376575f-dab8-4278-8bd8-882f1ae8c7f7/volumes" Apr 16 22:24:20.922812 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.922773 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve/seaweedfs-86cc847c5c-pl5vk"] Apr 16 22:24:20.923173 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.923098 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" containerName="discovery" Apr 16 22:24:20.923173 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.923109 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" containerName="discovery" Apr 16 22:24:20.923173 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.923162 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7376575f-dab8-4278-8bd8-882f1ae8c7f7" containerName="discovery" Apr 16 22:24:20.925989 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.925970 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:20.928399 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.928373 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:24:20.928519 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.928409 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-lfzqp\"" Apr 16 22:24:20.928575 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.928533 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:24:20.929150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.929134 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:24:20.935235 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:20.935214 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pl5vk"] Apr 16 22:24:21.008331 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.008296 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/67da3a4d-636d-4917-9c9a-1e717a8394bb-data\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.008483 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.008351 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8l5h\" (UniqueName: \"kubernetes.io/projected/67da3a4d-636d-4917-9c9a-1e717a8394bb-kube-api-access-p8l5h\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.109600 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.109563 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/67da3a4d-636d-4917-9c9a-1e717a8394bb-data\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.109766 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.109646 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8l5h\" (UniqueName: \"kubernetes.io/projected/67da3a4d-636d-4917-9c9a-1e717a8394bb-kube-api-access-p8l5h\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.109945 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.109927 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/67da3a4d-636d-4917-9c9a-1e717a8394bb-data\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.118961 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.118932 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8l5h\" (UniqueName: \"kubernetes.io/projected/67da3a4d-636d-4917-9c9a-1e717a8394bb-kube-api-access-p8l5h\") pod \"seaweedfs-86cc847c5c-pl5vk\" (UID: \"67da3a4d-636d-4917-9c9a-1e717a8394bb\") " pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.235461 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.235381 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:21.355742 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.355715 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-pl5vk"] Apr 16 22:24:21.357414 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:24:21.357386 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67da3a4d_636d_4917_9c9a_1e717a8394bb.slice/crio-b81a93fea1006a77d56fc58ab151d006d38d6b32e69f9212138e468ec2bd6444 WatchSource:0}: Error finding container b81a93fea1006a77d56fc58ab151d006d38d6b32e69f9212138e468ec2bd6444: Status 404 returned error can't find the container with id b81a93fea1006a77d56fc58ab151d006d38d6b32e69f9212138e468ec2bd6444 Apr 16 22:24:21.377084 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:21.377052 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pl5vk" event={"ID":"67da3a4d-636d-4917-9c9a-1e717a8394bb","Type":"ContainerStarted","Data":"b81a93fea1006a77d56fc58ab151d006d38d6b32e69f9212138e468ec2bd6444"} Apr 16 22:24:24.392291 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:24.392250 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-pl5vk" event={"ID":"67da3a4d-636d-4917-9c9a-1e717a8394bb","Type":"ContainerStarted","Data":"e4ead42481b00091c17f8dde59ecfb9a5c1271ea362b18e2507b2058d9ea59dd"} Apr 16 22:24:24.392693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:24.392396 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:24:24.407370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:24.407325 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-pl5vk" podStartSLOduration=1.8490826820000001 podStartE2EDuration="4.407313062s" podCreationTimestamp="2026-04-16 22:24:20 +0000 UTC" firstStartedPulling="2026-04-16 22:24:21.3588105 +0000 UTC m=+653.841329614" lastFinishedPulling="2026-04-16 22:24:23.917040863 +0000 UTC m=+656.399559994" observedRunningTime="2026-04-16 22:24:24.405902687 +0000 UTC m=+656.888421825" watchObservedRunningTime="2026-04-16 22:24:24.407313062 +0000 UTC m=+656.889832197" Apr 16 22:24:30.398004 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:24:30.397970 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-pl5vk" Apr 16 22:25:32.318210 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.318091 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-zm8l6"] Apr 16 22:25:32.321558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.321532 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.323718 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.323697 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 22:25:32.323827 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.323703 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-v9ggz\"" Apr 16 22:25:32.332640 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.332617 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-zm8l6"] Apr 16 22:25:32.334981 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.334957 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-ftrm9"] Apr 16 22:25:32.345470 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.345446 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ftrm9"] Apr 16 22:25:32.345617 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.345583 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.347967 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.347947 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bqjdn\"" Apr 16 22:25:32.347967 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.347958 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 22:25:32.456410 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.456380 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6fj\" (UniqueName: \"kubernetes.io/projected/ff881812-87ac-4176-bad1-2d5b98e46069-kube-api-access-qb6fj\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.456410 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.456413 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9xck\" (UniqueName: \"kubernetes.io/projected/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-kube-api-access-p9xck\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.456587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.456442 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-tls-certs\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.456587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.456562 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff881812-87ac-4176-bad1-2d5b98e46069-cert\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.557946 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.557913 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-tls-certs\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.558100 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.558066 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff881812-87ac-4176-bad1-2d5b98e46069-cert\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.558153 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.558099 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6fj\" (UniqueName: \"kubernetes.io/projected/ff881812-87ac-4176-bad1-2d5b98e46069-kube-api-access-qb6fj\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.558153 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.558129 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9xck\" (UniqueName: \"kubernetes.io/projected/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-kube-api-access-p9xck\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.560215 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:25:32.560189 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-tls-certs\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.560459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.560439 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff881812-87ac-4176-bad1-2d5b98e46069-cert\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.566656 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.566632 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6fj\" (UniqueName: \"kubernetes.io/projected/ff881812-87ac-4176-bad1-2d5b98e46069-kube-api-access-qb6fj\") pod \"odh-model-controller-696fc77849-ftrm9\" (UID: \"ff881812-87ac-4176-bad1-2d5b98e46069\") " pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.566942 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.566920 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9xck\" (UniqueName: \"kubernetes.io/projected/e2f0453c-f168-46f3-9c2b-6a1250bc1db5-kube-api-access-p9xck\") pod \"model-serving-api-86f7b4b499-zm8l6\" (UID: \"e2f0453c-f168-46f3-9c2b-6a1250bc1db5\") " pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.633597 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.633533 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:32.656681 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.656655 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:32.765567 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.765545 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-zm8l6"] Apr 16 22:25:32.768851 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:25:32.768824 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2f0453c_f168_46f3_9c2b_6a1250bc1db5.slice/crio-d6721f8e42f0d5b28a56d5647a2fe11c0d3e6104eff7401acfcd14107068308e WatchSource:0}: Error finding container d6721f8e42f0d5b28a56d5647a2fe11c0d3e6104eff7401acfcd14107068308e: Status 404 returned error can't find the container with id d6721f8e42f0d5b28a56d5647a2fe11c0d3e6104eff7401acfcd14107068308e Apr 16 22:25:32.787880 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:32.787859 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-ftrm9"] Apr 16 22:25:32.789573 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:25:32.789549 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff881812_87ac_4176_bad1_2d5b98e46069.slice/crio-bdfcdf5e78081bbd421894068bbc650c811d443b17c227bab1cee0c326d3fb91 WatchSource:0}: Error finding container bdfcdf5e78081bbd421894068bbc650c811d443b17c227bab1cee0c326d3fb91: Status 404 returned error can't find the container with id bdfcdf5e78081bbd421894068bbc650c811d443b17c227bab1cee0c326d3fb91 Apr 16 22:25:33.663712 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:33.663667 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ftrm9" event={"ID":"ff881812-87ac-4176-bad1-2d5b98e46069","Type":"ContainerStarted","Data":"bdfcdf5e78081bbd421894068bbc650c811d443b17c227bab1cee0c326d3fb91"} Apr 16 22:25:33.666031 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:33.665975 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-zm8l6" event={"ID":"e2f0453c-f168-46f3-9c2b-6a1250bc1db5","Type":"ContainerStarted","Data":"d6721f8e42f0d5b28a56d5647a2fe11c0d3e6104eff7401acfcd14107068308e"} Apr 16 22:25:36.681320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.681283 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-ftrm9" event={"ID":"ff881812-87ac-4176-bad1-2d5b98e46069","Type":"ContainerStarted","Data":"9ae09725ee2da015adf365a8b93385a75bf6262475e79e5b3ddc8e271d29c280"} Apr 16 22:25:36.681320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.681337 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:36.682646 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.682622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-zm8l6" event={"ID":"e2f0453c-f168-46f3-9c2b-6a1250bc1db5","Type":"ContainerStarted","Data":"cca2aee0d7f4998b451da012a2e630f082f715487044e0681c4a10633b8fb4cb"} Apr 16 22:25:36.682767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.682744 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:25:36.697708 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.697663 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-ftrm9" podStartSLOduration=0.954271227 podStartE2EDuration="4.697648122s" podCreationTimestamp="2026-04-16 22:25:32 +0000 UTC" firstStartedPulling="2026-04-16 22:25:32.790784718 +0000 UTC m=+725.273303830" lastFinishedPulling="2026-04-16 22:25:36.534161608 +0000 UTC m=+729.016680725" observedRunningTime="2026-04-16 22:25:36.695581799 +0000 UTC m=+729.178100933" watchObservedRunningTime="2026-04-16 22:25:36.697648122 +0000 UTC m=+729.180167257" Apr 16 
22:25:36.712084 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:36.712032 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-zm8l6" podStartSLOduration=0.952870666 podStartE2EDuration="4.712016862s" podCreationTimestamp="2026-04-16 22:25:32 +0000 UTC" firstStartedPulling="2026-04-16 22:25:32.771139301 +0000 UTC m=+725.253658413" lastFinishedPulling="2026-04-16 22:25:36.530285493 +0000 UTC m=+729.012804609" observedRunningTime="2026-04-16 22:25:36.71004185 +0000 UTC m=+729.192560988" watchObservedRunningTime="2026-04-16 22:25:36.712016862 +0000 UTC m=+729.194535996" Apr 16 22:25:47.690180 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:47.690146 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-ftrm9" Apr 16 22:25:47.692930 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:25:47.692907 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-zm8l6" Apr 16 22:26:19.174945 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.174906 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"] Apr 16 22:26:19.184477 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.184452 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.189806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.187377 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:26:19.189806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.187820 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 22:26:19.189806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.188063 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:26:19.189806 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.188572 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:26:19.196030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.192522 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"] Apr 16 22:26:19.242873 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.242836 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.242873 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.242872 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location\") pod 
\"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.243074 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.243023 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w7f\" (UniqueName: \"kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.243074 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.243060 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.243308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.243287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.243401 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.243323 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: 
\"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344011 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.343975 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w7f\" (UniqueName: \"kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344011 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344014 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344057 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344074 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344237 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:26:19.344113 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344139 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344520 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344493 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344591 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344566 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.344591 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.344583 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.346547 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.346518 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.346725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.346709 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.352577 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.352521 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w7f\" (UniqueName: \"kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f\") pod \"scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.501376 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.501291 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:19.634105 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.634072 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"] Apr 16 22:26:19.635702 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:26:19.635667 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f52587_ec6d_497c_bd37_a6d2ca48f5c2.slice/crio-048b325797a498d31ced3bfcc79585bad208b1d2e4b57e495c09592ff448aad7 WatchSource:0}: Error finding container 048b325797a498d31ced3bfcc79585bad208b1d2e4b57e495c09592ff448aad7: Status 404 returned error can't find the container with id 048b325797a498d31ced3bfcc79585bad208b1d2e4b57e495c09592ff448aad7 Apr 16 22:26:19.854192 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:19.854158 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerStarted","Data":"048b325797a498d31ced3bfcc79585bad208b1d2e4b57e495c09592ff448aad7"} Apr 16 22:26:22.869923 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:22.869885 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerStarted","Data":"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"} Apr 16 22:26:27.894517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:27.894477 2562 generic.go:358] "Generic (PLEG): container finished" podID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerID="6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949" exitCode=0 Apr 16 22:26:27.894910 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:27.894541 2562 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerDied","Data":"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"} Apr 16 22:26:29.906023 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:29.905987 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerStarted","Data":"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"} Apr 16 22:26:29.926612 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:29.926563 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" podStartSLOduration=1.582788224 podStartE2EDuration="10.92654976s" podCreationTimestamp="2026-04-16 22:26:19 +0000 UTC" firstStartedPulling="2026-04-16 22:26:19.638306554 +0000 UTC m=+772.120825667" lastFinishedPulling="2026-04-16 22:26:28.982068073 +0000 UTC m=+781.464587203" observedRunningTime="2026-04-16 22:26:29.924357288 +0000 UTC m=+782.406876427" watchObservedRunningTime="2026-04-16 22:26:29.92654976 +0000 UTC m=+782.409068895" Apr 16 22:26:39.501973 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:39.501939 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:39.502440 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:39.502011 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:26:39.514650 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:39.514624 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" 
Apr 16 22:26:39.953433 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:26:39.953410 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" Apr 16 22:27:19.773021 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.772984 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"] Apr 16 22:27:19.777630 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.777589 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" Apr 16 22:27:19.780186 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.780162 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-f6x7q\"" Apr 16 22:27:19.780325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.780234 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 22:27:19.784003 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.783978 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"] Apr 16 22:27:19.842781 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.842746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" Apr 16 22:27:19.842937 ip-10-0-135-106 kubenswrapper[2562]: 
I0416 22:27:19.842787 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.842937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.842812 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.842937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.842888 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj58t\" (UniqueName: \"kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.842937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.842930 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.843087 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.842959 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943488 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj58t\" (UniqueName: \"kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943504 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943540 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943666 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943710 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.943984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.943968 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.944102 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.944072 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.944169 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.944078 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.944169 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.944140 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.946272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.946247 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:19.951993 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:19.951967 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj58t\" (UniqueName: \"kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:20.089337 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:20.089310 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:20.220962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:20.220935 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"]
Apr 16 22:27:20.222572 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:27:20.222536 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf27e2ecd_4776_4122_858f_dc3c740648d5.slice/crio-ee7d2b9aaea06e036491108897757b1b06da5da459d85fbacbf82f80a308137d WatchSource:0}: Error finding container ee7d2b9aaea06e036491108897757b1b06da5da459d85fbacbf82f80a308137d: Status 404 returned error can't find the container with id ee7d2b9aaea06e036491108897757b1b06da5da459d85fbacbf82f80a308137d
Apr 16 22:27:21.102706 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.102658 2562 generic.go:358] "Generic (PLEG): container finished" podID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerID="9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c" exitCode=0
Apr 16 22:27:21.103148 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.102746 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerDied","Data":"9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c"}
Apr 16 22:27:21.103148 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.102798 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerStarted","Data":"ee7d2b9aaea06e036491108897757b1b06da5da459d85fbacbf82f80a308137d"}
Apr 16 22:27:21.399055 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.399008 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"]
Apr 16 22:27:21.399651 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.399617 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="main" containerID="cri-o://5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59" gracePeriod=30
Apr 16 22:27:21.666392 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.666371 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"
Apr 16 22:27:21.759952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.759918 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.759952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.759960 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.760184 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760022 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.760184 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760043 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.760184 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760081 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.760184 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760120 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5w7f\" (UniqueName: \"kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f\") pod \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\" (UID: \"77f52587-ec6d-497c-bd37-a6d2ca48f5c2\") "
Apr 16 22:27:21.760389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760326 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache" (OuterVolumeSpecName: "model-cache") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:21.760389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.760337 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home" (OuterVolumeSpecName: "home") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:21.762782 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.762747 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm" (OuterVolumeSpecName: "dshm") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:21.762903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.762787 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:27:21.762903 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.762795 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f" (OuterVolumeSpecName: "kube-api-access-g5w7f") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "kube-api-access-g5w7f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:27:21.820401 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.820363 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "77f52587-ec6d-497c-bd37-a6d2ca48f5c2" (UID: "77f52587-ec6d-497c-bd37-a6d2ca48f5c2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:21.861729 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861699 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-dshm\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:21.861883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861736 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:21.861883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861751 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5w7f\" (UniqueName: \"kubernetes.io/projected/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-kube-api-access-g5w7f\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:21.861883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861761 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:21.861883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861770 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-model-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:21.861883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:21.861780 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/77f52587-ec6d-497c-bd37-a6d2ca48f5c2-home\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.108855 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.108824 2562 generic.go:358] "Generic (PLEG): container finished" podID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerID="5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59" exitCode=0
Apr 16 22:27:22.109150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.108887 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"
Apr 16 22:27:22.109150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.108895 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerDied","Data":"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"}
Apr 16 22:27:22.109150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.108942 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s" event={"ID":"77f52587-ec6d-497c-bd37-a6d2ca48f5c2","Type":"ContainerDied","Data":"048b325797a498d31ced3bfcc79585bad208b1d2e4b57e495c09592ff448aad7"}
Apr 16 22:27:22.109150 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.108958 2562 scope.go:117] "RemoveContainer" containerID="5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"
Apr 16 22:27:22.110773 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.110753 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerStarted","Data":"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"}
Apr 16 22:27:22.119363 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.119341 2562 scope.go:117] "RemoveContainer" containerID="6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"
Apr 16 22:27:22.131144 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.131124 2562 scope.go:117] "RemoveContainer" containerID="5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"
Apr 16 22:27:22.131452 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:27:22.131429 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59\": container with ID starting with 5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59 not found: ID does not exist" containerID="5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"
Apr 16 22:27:22.131552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.131463 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59"} err="failed to get container status \"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59\": rpc error: code = NotFound desc = could not find container \"5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59\": container with ID starting with 5fc22b7ee79e9cd0932afa55f4b30b2dc746666d2edd76fb2a3957ebdfda9a59 not found: ID does not exist"
Apr 16 22:27:22.131552 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.131486 2562 scope.go:117] "RemoveContainer" containerID="6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"
Apr 16 22:27:22.131791 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:27:22.131772 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949\": container with ID starting with 6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949 not found: ID does not exist" containerID="6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"
Apr 16 22:27:22.131840 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.131797 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949"} err="failed to get container status \"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949\": rpc error: code = NotFound desc = could not find container \"6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949\": container with ID starting with 6dbdcea45f7d702860c844d8e7a6b3dd01c063af37bb12a5437a570330830949 not found: ID does not exist"
Apr 16 22:27:22.133370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.132530 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"]
Apr 16 22:27:22.136673 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:22.136650 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-7bb5d47d47-ssr7s"]
Apr 16 22:27:24.132260 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:24.132226 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" path="/var/lib/kubelet/pods/77f52587-ec6d-497c-bd37-a6d2ca48f5c2/volumes"
Apr 16 22:27:51.250907 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:51.250823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerStarted","Data":"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"}
Apr 16 22:27:51.251339 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:51.250942 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:27:51.271560 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:51.271497 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podStartSLOduration=2.475052014 podStartE2EDuration="32.271482807s" podCreationTimestamp="2026-04-16 22:27:19 +0000 UTC" firstStartedPulling="2026-04-16 22:27:21.10395932 +0000 UTC m=+833.586478433" lastFinishedPulling="2026-04-16 22:27:50.900390104 +0000 UTC m=+863.382909226" observedRunningTime="2026-04-16 22:27:51.269048783 +0000 UTC m=+863.751567918" watchObservedRunningTime="2026-04-16 22:27:51.271482807 +0000 UTC m=+863.754001943"
Apr 16 22:27:52.257525 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:27:52.257491 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 22:28:00.089646 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.089595 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:00.090163 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.089657 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:00.091117 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.091095 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:00.091222 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.091149 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 22:28:00.287065 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.287024 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 22:28:00.287271 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:00.287255 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:01.291490 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:01.291457 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 22:28:02.946720 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:02.946686 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"]
Apr 16 22:28:02.947213 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:02.946973 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" containerID="cri-o://af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9" gracePeriod=30
Apr 16 22:28:02.947213 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:02.947040 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="tokenizer" containerID="cri-o://b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195" gracePeriod=30
Apr 16 22:28:02.948717 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:02.948684 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 22:28:03.299938 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:03.299905 2562 generic.go:358] "Generic (PLEG): container finished" podID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerID="af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9" exitCode=0
Apr 16 22:28:03.300103 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:03.299979 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerDied","Data":"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"}
Apr 16 22:28:04.094931 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.094906 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:04.140459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140394 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140422 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140451 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj58t\" (UniqueName: \"kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140732 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140489 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140732 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140540 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140732 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140573 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location\") pod \"f27e2ecd-4776-4122-858f-dc3c740648d5\" (UID: \"f27e2ecd-4776-4122-858f-dc3c740648d5\") "
Apr 16 22:28:04.140732 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140655 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:04.140941 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140798 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:04.140941 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140811 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:04.140941 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.140823 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.141292 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.141272 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:28:04.142582 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.142558 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:28:04.142691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.142634 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t" (OuterVolumeSpecName: "kube-api-access-cj58t") pod "f27e2ecd-4776-4122-858f-dc3c740648d5" (UID: "f27e2ecd-4776-4122-858f-dc3c740648d5"). InnerVolumeSpecName "kube-api-access-cj58t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:28:04.242194 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.242170 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f27e2ecd-4776-4122-858f-dc3c740648d5-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.242194 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.242193 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.242486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.242203 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-tmp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.242486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.242212 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cj58t\" (UniqueName: \"kubernetes.io/projected/f27e2ecd-4776-4122-858f-dc3c740648d5-kube-api-access-cj58t\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.242486 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.242222 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f27e2ecd-4776-4122-858f-dc3c740648d5-tokenizer-uds\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:28:04.305834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.305799 2562 generic.go:358] "Generic (PLEG): container finished" podID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerID="b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195" exitCode=0
Apr 16 22:28:04.305952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.305862 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerDied","Data":"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"}
Apr 16 22:28:04.305952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.305877 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"
Apr 16 22:28:04.305952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.305899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx" event={"ID":"f27e2ecd-4776-4122-858f-dc3c740648d5","Type":"ContainerDied","Data":"ee7d2b9aaea06e036491108897757b1b06da5da459d85fbacbf82f80a308137d"}
Apr 16 22:28:04.305952 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.305918 2562 scope.go:117] "RemoveContainer" containerID="b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"
Apr 16 22:28:04.314755 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.314729 2562 scope.go:117] "RemoveContainer" containerID="af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"
Apr 16 22:28:04.322135 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.322118 2562 scope.go:117] "RemoveContainer" containerID="9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c"
Apr 16 22:28:04.328705 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.328679 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"]
Apr 16 22:28:04.330564 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.330542 2562 scope.go:117] "RemoveContainer" containerID="b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"
Apr 16 22:28:04.330984 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:04.330960 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195\": container with ID starting with b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195 not found: ID does not exist" containerID="b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"
Apr 16 22:28:04.331043 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.330993 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195"} err="failed to get container status \"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195\": rpc error: code = NotFound desc = could not find container \"b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195\": container with ID starting with b5409843618abef6a503e0897a6b2a54b4c3ed1bcdffaee9709b09cba35c6195 not found: ID does not exist"
Apr 16 22:28:04.331043 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.331012 2562 scope.go:117] "RemoveContainer" containerID="af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"
Apr 16 22:28:04.331226 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:04.331209 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9\": container with ID starting with af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9 not found: ID does not exist" containerID="af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"
Apr 16 22:28:04.331267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.331230 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9"} err="failed to get container
status \"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9\": rpc error: code = NotFound desc = could not find container \"af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9\": container with ID starting with af8df1dab8e6f4b3a76944131b0577f4656e4798eebdeb36c34ea3f30edabfc9 not found: ID does not exist" Apr 16 22:28:04.331267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.331243 2562 scope.go:117] "RemoveContainer" containerID="9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c" Apr 16 22:28:04.331432 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:04.331416 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c\": container with ID starting with 9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c not found: ID does not exist" containerID="9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c" Apr 16 22:28:04.331476 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.331434 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c"} err="failed to get container status \"9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c\": rpc error: code = NotFound desc = could not find container \"9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c\": container with ID starting with 9747cd0912762244b691f1ee43f5b7c2986cdb75f1d097a6a68bf59373e1457c not found: ID does not exist" Apr 16 22:28:04.331515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:04.331489 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875fbjgqx"] Apr 16 22:28:06.131222 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:06.131183 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" path="/var/lib/kubelet/pods/f27e2ecd-4776-4122-858f-dc3c740648d5/volumes" Apr 16 22:28:08.879263 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879229 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:08.879692 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879669 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="main" Apr 16 22:28:08.879692 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879688 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="main" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879702 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="storage-initializer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879710 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="storage-initializer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879720 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="tokenizer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879726 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="tokenizer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879736 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="storage-initializer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879742 2562 
state_mem.go:107] "Deleted CPUSet assignment" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="storage-initializer" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879751 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" Apr 16 22:28:08.879787 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879756 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" Apr 16 22:28:08.880254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879850 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="77f52587-ec6d-497c-bd37-a6d2ca48f5c2" containerName="main" Apr 16 22:28:08.880254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879860 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="main" Apr 16 22:28:08.880254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.879868 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f27e2ecd-4776-4122-858f-dc3c740648d5" containerName="tokenizer" Apr 16 22:28:08.884680 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.884661 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.887796 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.887779 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:28:08.887894 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.887848 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 22:28:08.888095 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.888076 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:28:08.888158 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.888077 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:28:08.893721 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.893699 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:08.980307 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980280 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.980405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980318 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home\") pod 
\"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.980405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980339 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.980405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmbr\" (UniqueName: \"kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.980405 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980390 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:08.980539 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:08.980410 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081470 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081635 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081640 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081799 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081703 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7nmbr\" (UniqueName: \"kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.081866 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.081847 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.082175 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.082124 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.082285 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.082182 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.082285 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.082270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home\") pod 
\"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.084311 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.084287 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.084551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.084536 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.097547 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.097526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmbr\" (UniqueName: \"kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr\") pod \"precise-prefix-cache-test-kserve-97cdbff9-7f4js\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.107751 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.107727 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"] Apr 16 22:28:09.114807 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.114789 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.117045 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.117023 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-8n7z5\"" Apr 16 22:28:09.121318 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.121292 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"] Apr 16 22:28:09.183128 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183059 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.183253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183126 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.183253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183181 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: 
\"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.183253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183213 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.183253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183246 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.183467 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.183280 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.196044 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.196019 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:09.284238 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284152 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284238 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284199 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284444 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284334 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284444 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284405 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284482 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284542 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284683 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284637 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284683 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284674 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.284922 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284900 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.285018 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.284966 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.287032 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.287001 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.291962 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.291941 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb\") pod \"precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.321351 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.321322 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:09.322586 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:28:09.322556 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80673b65_eb15_404a_a78e_b64e61cbc84b.slice/crio-cbbea9100cfa81535a5823a03f8d4a7f610a207de0922a92cd3f2c9645a620c7 WatchSource:0}: Error finding container cbbea9100cfa81535a5823a03f8d4a7f610a207de0922a92cd3f2c9645a620c7: Status 404 returned error can't find the container with id cbbea9100cfa81535a5823a03f8d4a7f610a207de0922a92cd3f2c9645a620c7 Apr 16 22:28:09.426793 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.426764 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:09.553834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:09.553449 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"] Apr 16 22:28:09.558635 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:28:09.558583 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbfe839_7744_4859_988d_f2a313b82cce.slice/crio-3466041f4876504f562fc74e59c7b43861868bc6c9e5059d550deec27ec3322c WatchSource:0}: Error finding container 3466041f4876504f562fc74e59c7b43861868bc6c9e5059d550deec27ec3322c: Status 404 returned error can't find the container with id 3466041f4876504f562fc74e59c7b43861868bc6c9e5059d550deec27ec3322c Apr 16 22:28:10.331084 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:10.330941 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerStarted","Data":"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"} Apr 16 22:28:10.331084 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:10.331000 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerStarted","Data":"3466041f4876504f562fc74e59c7b43861868bc6c9e5059d550deec27ec3322c"} Apr 16 22:28:10.332587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:10.332550 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerStarted","Data":"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e"} Apr 16 22:28:10.332587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:10.332587 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerStarted","Data":"cbbea9100cfa81535a5823a03f8d4a7f610a207de0922a92cd3f2c9645a620c7"} Apr 16 22:28:11.338136 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:11.338098 2562 generic.go:358] "Generic (PLEG): container finished" podID="7bbfe839-7744-4859-988d-f2a313b82cce" containerID="acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3" exitCode=0 Apr 16 22:28:11.338661 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:11.338168 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" 
event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerDied","Data":"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"} Apr 16 22:28:12.344284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:12.344248 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerStarted","Data":"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"} Apr 16 22:28:12.344284 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:12.344286 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerStarted","Data":"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"} Apr 16 22:28:12.344850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:12.344416 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:12.366169 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:12.366111 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" podStartSLOduration=3.36609829 podStartE2EDuration="3.36609829s" podCreationTimestamp="2026-04-16 22:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:28:12.362934264 +0000 UTC m=+884.845453402" watchObservedRunningTime="2026-04-16 22:28:12.36609829 +0000 UTC m=+884.848617426" Apr 16 22:28:14.353688 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:14.353653 2562 generic.go:358] "Generic (PLEG): container finished" podID="80673b65-eb15-404a-a78e-b64e61cbc84b" 
containerID="de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e" exitCode=0 Apr 16 22:28:14.354047 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:14.353729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerDied","Data":"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e"} Apr 16 22:28:15.359331 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:15.359289 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerStarted","Data":"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e"} Apr 16 22:28:15.380725 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:15.380657 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" podStartSLOduration=7.3806372 podStartE2EDuration="7.3806372s" podCreationTimestamp="2026-04-16 22:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:28:15.376596032 +0000 UTC m=+887.859115170" watchObservedRunningTime="2026-04-16 22:28:15.3806372 +0000 UTC m=+887.863156336" Apr 16 22:28:19.196240 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.196154 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:19.196240 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.196202 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:19.208755 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.208728 2562 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:19.387004 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.386975 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:19.427587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.427552 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:19.427587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.427588 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:19.428549 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:28:19.428530 2562 logging.go:55] [core] [Channel #35 SubChannel #36]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.45:9003", ServerName: "10.132.0.45:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.45:9003: connect: connection refused" Apr 16 22:28:19.429842 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:19.429814 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:20.380460 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:20.380434 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:20.427928 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:20.427892 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.45:9003\" within 1s: context deadline exceeded" Apr 16 22:28:29.427850 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:28:29.427812 2562 logging.go:55] [core] [Channel #37 SubChannel #38]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.45:9003", ServerName: "10.132.0.45:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.45:9003: connect: connection refused" Apr 16 22:28:30.428163 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:30.428112 2562 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.45:9003\" within 1s: context deadline exceeded" Apr 16 22:28:41.384826 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:41.384794 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:42.599408 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.599361 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:42.599883 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.599721 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="main" containerID="cri-o://8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e" gracePeriod=30 Apr 16 22:28:42.606542 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.606264 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"] Apr 16 22:28:42.608441 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.606591 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main" 
containerID="cri-o://70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489" gracePeriod=30 Apr 16 22:28:42.608441 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.606668 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="tokenizer" containerID="cri-o://d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257" gracePeriod=30 Apr 16 22:28:42.858920 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.858856 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:42.968059 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968027 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968059 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968060 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmbr\" (UniqueName: \"kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968091 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968122 2562 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968153 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968195 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location\") pod \"80673b65-eb15-404a-a78e-b64e61cbc84b\" (UID: \"80673b65-eb15-404a-a78e-b64e61cbc84b\") " Apr 16 22:28:42.968475 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968336 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache" (OuterVolumeSpecName: "model-cache") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:42.968475 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968387 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home" (OuterVolumeSpecName: "home") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:42.968587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968492 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-model-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:42.968587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.968510 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-home\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:42.970764 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.970740 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm" (OuterVolumeSpecName: "dshm") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:42.970871 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.970776 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr" (OuterVolumeSpecName: "kube-api-access-7nmbr") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "kube-api-access-7nmbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:28:42.970871 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:42.970789 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:43.024129 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.024093 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "80673b65-eb15-404a-a78e-b64e61cbc84b" (UID: "80673b65-eb15-404a-a78e-b64e61cbc84b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:43.069350 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.069314 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80673b65-eb15-404a-a78e-b64e61cbc84b-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.069350 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.069345 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nmbr\" (UniqueName: \"kubernetes.io/projected/80673b65-eb15-404a-a78e-b64e61cbc84b-kube-api-access-7nmbr\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.069350 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.069357 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-dshm\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.069572 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.069368 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80673b65-eb15-404a-a78e-b64e61cbc84b-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:43.465713 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.465679 2562 generic.go:358] "Generic (PLEG): container finished" podID="7bbfe839-7744-4859-988d-f2a313b82cce" 
containerID="70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489" exitCode=0 Apr 16 22:28:43.465906 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.465758 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerDied","Data":"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"} Apr 16 22:28:43.467189 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.467163 2562 generic.go:358] "Generic (PLEG): container finished" podID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerID="8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e" exitCode=0 Apr 16 22:28:43.467327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.467220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerDied","Data":"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e"} Apr 16 22:28:43.467327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.467226 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" Apr 16 22:28:43.467327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.467243 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js" event={"ID":"80673b65-eb15-404a-a78e-b64e61cbc84b","Type":"ContainerDied","Data":"cbbea9100cfa81535a5823a03f8d4a7f610a207de0922a92cd3f2c9645a620c7"} Apr 16 22:28:43.467327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.467258 2562 scope.go:117] "RemoveContainer" containerID="8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e" Apr 16 22:28:43.476701 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.476681 2562 scope.go:117] "RemoveContainer" containerID="de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e" Apr 16 22:28:43.490702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.490675 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:43.492071 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.492047 2562 scope.go:117] "RemoveContainer" containerID="8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e" Apr 16 22:28:43.492457 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:43.492403 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e\": container with ID starting with 8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e not found: ID does not exist" containerID="8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e" Apr 16 22:28:43.492572 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.492457 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e"} 
err="failed to get container status \"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e\": rpc error: code = NotFound desc = could not find container \"8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e\": container with ID starting with 8185ac7efd74d405e9d90295bcad3ad239355f15a681bd4893229b4bd9a66c2e not found: ID does not exist" Apr 16 22:28:43.492572 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.492475 2562 scope.go:117] "RemoveContainer" containerID="de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e" Apr 16 22:28:43.492811 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:43.492779 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e\": container with ID starting with de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e not found: ID does not exist" containerID="de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e" Apr 16 22:28:43.492870 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.492820 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e"} err="failed to get container status \"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e\": rpc error: code = NotFound desc = could not find container \"de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e\": container with ID starting with de87c97ea0289d6d246554b913163586ae5ee5637d528ec428ac84bd72073f5e not found: ID does not exist" Apr 16 22:28:43.494527 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:43.494505 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-97cdbff9-7f4js"] Apr 16 22:28:44.043853 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.043834 2562 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" Apr 16 22:28:44.130838 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.130773 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" path="/var/lib/kubelet/pods/80673b65-eb15-404a-a78e-b64e61cbc84b/volumes" Apr 16 22:28:44.178681 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178656 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.178810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178696 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.178810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178715 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.178810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178738 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.178810 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178781 
2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.179027 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178842 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache\") pod \"7bbfe839-7744-4859-988d-f2a313b82cce\" (UID: \"7bbfe839-7744-4859-988d-f2a313b82cce\") " Apr 16 22:28:44.179027 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.178977 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:44.179027 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.179012 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:44.179172 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.179109 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-uds\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.179273 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.179248 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:44.179452 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.179434 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:44.180850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.180832 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb" (OuterVolumeSpecName: "kube-api-access-sdblb") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "kube-api-access-sdblb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:28:44.181038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.181023 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7bbfe839-7744-4859-988d-f2a313b82cce" (UID: "7bbfe839-7744-4859-988d-f2a313b82cce"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:44.279466 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.279444 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bbfe839-7744-4859-988d-f2a313b82cce-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.279466 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.279464 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.279624 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.279475 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-tokenizer-tmp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.279624 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.279485 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bbfe839-7744-4859-988d-f2a313b82cce-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.279624 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.279494 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdblb\" (UniqueName: 
\"kubernetes.io/projected/7bbfe839-7744-4859-988d-f2a313b82cce-kube-api-access-sdblb\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:28:44.473389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.473298 2562 generic.go:358] "Generic (PLEG): container finished" podID="7bbfe839-7744-4859-988d-f2a313b82cce" containerID="d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257" exitCode=0 Apr 16 22:28:44.473389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.473330 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerDied","Data":"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"} Apr 16 22:28:44.473389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.473371 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx" event={"ID":"7bbfe839-7744-4859-988d-f2a313b82cce","Type":"ContainerDied","Data":"3466041f4876504f562fc74e59c7b43861868bc6c9e5059d550deec27ec3322c"} Apr 16 22:28:44.473389 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.473381 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"
Apr 16 22:28:44.473714 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.473389 2562 scope.go:117] "RemoveContainer" containerID="d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"
Apr 16 22:28:44.483542 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.483528 2562 scope.go:117] "RemoveContainer" containerID="70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"
Apr 16 22:28:44.491277 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.491257 2562 scope.go:117] "RemoveContainer" containerID="acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"
Apr 16 22:28:44.496951 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.496929 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"]
Apr 16 22:28:44.499143 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.499122 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-776dbc46pz9vx"]
Apr 16 22:28:44.500250 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.500234 2562 scope.go:117] "RemoveContainer" containerID="d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"
Apr 16 22:28:44.500504 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:44.500483 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257\": container with ID starting with d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257 not found: ID does not exist" containerID="d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"
Apr 16 22:28:44.500571 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.500510 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257"} err="failed to get container status \"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257\": rpc error: code = NotFound desc = could not find container \"d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257\": container with ID starting with d19abc52b1cd34d12ee91d3c596d7fd9fa1cdb66806be392ca6c35872275c257 not found: ID does not exist"
Apr 16 22:28:44.500571 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.500539 2562 scope.go:117] "RemoveContainer" containerID="70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"
Apr 16 22:28:44.500767 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:44.500749 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489\": container with ID starting with 70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489 not found: ID does not exist" containerID="70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"
Apr 16 22:28:44.500812 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.500771 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489"} err="failed to get container status \"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489\": rpc error: code = NotFound desc = could not find container \"70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489\": container with ID starting with 70b59b515037a2150035d4fc5ba45362601c647c3fb1a6a96e271c4617519489 not found: ID does not exist"
Apr 16 22:28:44.500812 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.500782 2562 scope.go:117] "RemoveContainer" containerID="acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"
Apr 16 22:28:44.501001 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:28:44.500983 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3\": container with ID starting with acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3 not found: ID does not exist" containerID="acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"
Apr 16 22:28:44.501038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:44.501004 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3"} err="failed to get container status \"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3\": rpc error: code = NotFound desc = could not find container \"acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3\": container with ID starting with acf9e030e41e73fffd66c9602e1c880f037eb365367c103b56608d3ba416f7d3 not found: ID does not exist"
Apr 16 22:28:46.130702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:46.130666 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" path="/var/lib/kubelet/pods/7bbfe839-7744-4859-988d-f2a313b82cce/volumes"
Apr 16 22:28:55.091540 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091494 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091880 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="storage-initializer"
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091893 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="storage-initializer"
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091915 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="main"
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091921 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="main"
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091930 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main"
Apr 16 22:28:55.091936 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091938 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091948 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="storage-initializer"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091953 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="storage-initializer"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091964 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="tokenizer"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.091969 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="tokenizer"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.092024 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="main"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.092033 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="80673b65-eb15-404a-a78e-b64e61cbc84b" containerName="main"
Apr 16 22:28:55.092134 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.092043 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bbfe839-7744-4859-988d-f2a313b82cce" containerName="tokenizer"
Apr 16 22:28:55.094115 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.094083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.096435 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.096414 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 22:28:55.097301 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.097269 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\""
Apr 16 22:28:55.097412 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.097313 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 22:28:55.097412 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.097321 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 22:28:55.097551 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.097269 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-9kjlt\""
Apr 16 22:28:55.106820 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.106795 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:28:55.269545 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269509 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzqgs\" (UniqueName: \"kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.269760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269553 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.269760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.269882 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269833 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.269940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269921 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.269993 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.269951 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371139 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371037 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371139 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371085 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371152 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371225 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371378 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371256 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzqgs\" (UniqueName: \"kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371491 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371532 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371561 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.371634 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.371586 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.373667 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.373647 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.380022 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.379995 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzqgs\" (UniqueName: \"kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.405994 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.405964 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:55.533659 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:55.533626 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:28:55.535857 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:28:55.535783 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760ad4d6_0ab8_4c01_ab16_d283af6fb4d2.slice/crio-20280a0a904da1bc7aa8ee36b91eeb3103d41aac398160cb8fe7a54cdf133526 WatchSource:0}: Error finding container 20280a0a904da1bc7aa8ee36b91eeb3103d41aac398160cb8fe7a54cdf133526: Status 404 returned error can't find the container with id 20280a0a904da1bc7aa8ee36b91eeb3103d41aac398160cb8fe7a54cdf133526
Apr 16 22:28:56.521266 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:56.521236 2562 generic.go:358] "Generic (PLEG): container finished" podID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerID="913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6" exitCode=0
Apr 16 22:28:56.521652 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:56.521315 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerDied","Data":"913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6"}
Apr 16 22:28:56.521652 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:56.521347 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerStarted","Data":"20280a0a904da1bc7aa8ee36b91eeb3103d41aac398160cb8fe7a54cdf133526"}
Apr 16 22:28:57.527684 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:57.527649 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerStarted","Data":"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"}
Apr 16 22:28:57.527684 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:57.527683 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerStarted","Data":"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"}
Apr 16 22:28:57.528206 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:57.527760 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:28:57.550770 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:28:57.550708 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" podStartSLOduration=2.550686807 podStartE2EDuration="2.550686807s" podCreationTimestamp="2026-04-16 22:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:28:57.547039534 +0000 UTC m=+930.029558668" watchObservedRunningTime="2026-04-16 22:28:57.550686807 +0000 UTC m=+930.033205944"
Apr 16 22:29:05.406839 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:29:05.406805 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:29:05.406839 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:29:05.406846 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:29:05.409436 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:29:05.409411 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:29:05.560654 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:29:05.560598 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:29:26.566025 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:29:26.565990 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:31:05.356647 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:05.356588 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:31:05.357228 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:05.356957 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="main" containerID="cri-o://d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc" gracePeriod=30
Apr 16 22:31:05.357228 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:05.357047 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="tokenizer" containerID="cri-o://0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9" gracePeriod=30
Apr 16 22:31:05.559767 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:05.559727 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.46:8082/healthz\": dial tcp 10.132.0.46:8082: connect: connection refused"
Apr 16 22:31:06.026662 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.026624 2562 generic.go:358] "Generic (PLEG): container finished" podID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerID="d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc" exitCode=0
Apr 16 22:31:06.026662 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.026635 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerDied","Data":"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"}
Apr 16 22:31:06.600235 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.600215 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:31:06.747576 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747489 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.747576 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747522 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.747576 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747544 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzqgs\" (UniqueName: \"kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.747884 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747587 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.747884 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747780 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.747884 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747832 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:06.748037 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747897 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location\") pod \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\" (UID: \"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2\") "
Apr 16 22:31:06.748037 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.747922 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:06.748136 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.748044 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:06.748189 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.748151 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-uds\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:06.748189 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.748173 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-tmp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:06.748259 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.748189 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tokenizer-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:06.748567 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.748548 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:31:06.749693 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.749671 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs" (OuterVolumeSpecName: "kube-api-access-nzqgs") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "kube-api-access-nzqgs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:31:06.749777 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.749704 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" (UID: "760ad4d6-0ab8-4c01-ab16-d283af6fb4d2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:31:06.849245 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.849208 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:06.849245 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.849241 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzqgs\" (UniqueName: \"kubernetes.io/projected/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-kube-api-access-nzqgs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:06.849245 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:06.849251 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\""
Apr 16 22:31:07.031762 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.031703 2562 generic.go:358] "Generic (PLEG): container finished" podID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerID="0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9" exitCode=0
Apr 16 22:31:07.031858 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.031780 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerDied","Data":"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"}
Apr 16 22:31:07.031858 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.031817 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" event={"ID":"760ad4d6-0ab8-4c01-ab16-d283af6fb4d2","Type":"ContainerDied","Data":"20280a0a904da1bc7aa8ee36b91eeb3103d41aac398160cb8fe7a54cdf133526"}
Apr 16 22:31:07.031858 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.031834 2562 scope.go:117] "RemoveContainer" containerID="0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"
Apr 16 22:31:07.031977 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.031787 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"
Apr 16 22:31:07.041160 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.041141 2562 scope.go:117] "RemoveContainer" containerID="d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"
Apr 16 22:31:07.048837 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.048822 2562 scope.go:117] "RemoveContainer" containerID="913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6"
Apr 16 22:31:07.054763 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.054739 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:31:07.057256 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.057237 2562 scope.go:117] "RemoveContainer" containerID="0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"
Apr 16 22:31:07.057538 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:31:07.057516 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9\": container with ID starting with 0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9 not found: ID does not exist" containerID="0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"
Apr 16 22:31:07.057637 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.057546 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9"} err="failed to get container status \"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9\": rpc error: code = NotFound desc = could not find container \"0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9\": container with ID starting with 0911a21711545abe52025d8333db1c1e7a58616af21f1e888dde179b4d7685b9 not found: ID does not exist"
Apr 16 22:31:07.057637 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.057565 2562 scope.go:117] "RemoveContainer" containerID="d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"
Apr 16 22:31:07.057861 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:31:07.057844 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc\": container with ID starting with d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc not found: ID does not exist" containerID="d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"
Apr 16 22:31:07.057911 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.057875 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc"} err="failed to get container status \"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc\": rpc error: code = NotFound desc = could not find container \"d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc\": container with ID starting with d02cd9c7a184431fec0c82b9e2bed33696856b1d16f45839140251ddc11b31fc not found: ID does not exist"
Apr 16 22:31:07.057911 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.057893 2562 scope.go:117] "RemoveContainer" containerID="913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6"
Apr 16 22:31:07.058105 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:31:07.058090 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6\": container with ID starting with 913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6 not found: ID does not exist" containerID="913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6"
Apr 16 22:31:07.058143 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.058112 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6"} err="failed to get container status \"913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6\": rpc error: code = NotFound desc = could not find container \"913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6\": container with ID starting with 913c8bb1dfc4205732c156e33f8b7b45e58f14c693f3680a2378873c998026a6 not found: ID does not exist"
Apr 16 22:31:07.058287 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.058269 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks"]
Apr 16 22:31:07.564847 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:07.564786 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-nb4ks" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.46:9003\" within 1s: context deadline exceeded"
Apr 16 22:31:07.564847 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:31:07.564845 2562 logging.go:55] [core] [Channel #111 SubChannel #112]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.46:9003", ServerName: "10.132.0.46:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.46:9003: operation was canceled"
Apr 16 22:31:08.130328 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:08.130294 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" path="/var/lib/kubelet/pods/760ad4d6-0ab8-4c01-ab16-d283af6fb4d2/volumes"
Apr 16 22:31:32.916777 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.916745 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"]
Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917098 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="main"
Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917111 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="main"
Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917127 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="tokenizer"
Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917133 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="tokenizer"
Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917148 2562 cpu_manager.go:401] "RemoveStaleState:
containerMap: removing container" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="storage-initializer" Apr 16 22:31:32.917155 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917154 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="storage-initializer" Apr 16 22:31:32.917364 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917205 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="main" Apr 16 22:31:32.917364 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.917215 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="760ad4d6-0ab8-4c01-ab16-d283af6fb4d2" containerName="tokenizer" Apr 16 22:31:32.921361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.921342 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.924011 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.923983 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:31:32.924138 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.924105 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:31:32.927974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.924651 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-rgv2d\"" Apr 16 22:31:32.927974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.925117 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:31:32.927974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.925184 2562 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 22:31:32.933934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.933911 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"] Apr 16 22:31:32.941164 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941127 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.941164 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941167 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.941346 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941249 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.941346 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941278 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-qfjbl\" (UniqueName: \"kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.941346 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941302 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:32.941346 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:32.941330 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042325 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042461 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042338 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042461 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042386 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042461 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042409 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042651 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042651 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042505 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjbl\" (UniqueName: 
\"kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042812 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042877 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042875 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.042934 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.042914 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.044867 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.044851 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.051480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.051457 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjbl\" (UniqueName: \"kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl\") pod \"stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.237077 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.237002 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:33.377495 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.377466 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"] Apr 16 22:31:33.378481 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:31:33.378457 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6df016_f078_46dc_a601_4bdcdf15c0b9.slice/crio-675dce4a2a5bc4e881b53eefff47067b0edc637e55229f195d4d12c578919fdf WatchSource:0}: Error finding container 675dce4a2a5bc4e881b53eefff47067b0edc637e55229f195d4d12c578919fdf: Status 404 returned error can't find the container with id 675dce4a2a5bc4e881b53eefff47067b0edc637e55229f195d4d12c578919fdf Apr 16 22:31:33.380708 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:33.380688 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:31:34.143972 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:34.143935 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerStarted","Data":"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88"} Apr 16 22:31:34.143972 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:34.143971 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerStarted","Data":"675dce4a2a5bc4e881b53eefff47067b0edc637e55229f195d4d12c578919fdf"} Apr 16 22:31:35.149746 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:35.149710 2562 generic.go:358] "Generic (PLEG): container finished" 
podID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerID="15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88" exitCode=0 Apr 16 22:31:35.150207 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:35.149801 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerDied","Data":"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88"} Apr 16 22:31:36.154783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:36.154749 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerStarted","Data":"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44"} Apr 16 22:31:36.154783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:36.154783 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerStarted","Data":"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827"} Apr 16 22:31:36.155253 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:36.154884 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:36.177570 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:36.177523 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" podStartSLOduration=4.17751022 podStartE2EDuration="4.17751022s" podCreationTimestamp="2026-04-16 22:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
22:31:36.174300198 +0000 UTC m=+1088.656819333" watchObservedRunningTime="2026-04-16 22:31:36.17751022 +0000 UTC m=+1088.660029354" Apr 16 22:31:43.237459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:43.237412 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:43.237459 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:43.237468 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:43.240154 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:43.240130 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:31:44.188419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:31:44.188387 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:32:05.193956 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:32:05.193929 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:33:14.136443 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.136411 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"] Apr 16 22:33:14.136911 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.136690 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="main" containerID="cri-o://b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827" gracePeriod=30 Apr 16 22:33:14.136911 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.136753 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="tokenizer" containerID="cri-o://a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44" gracePeriod=30 Apr 16 22:33:14.188093 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.188062 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.47:8082/healthz\": dial tcp 10.132.0.47:8082: connect: connection refused" Apr 16 22:33:14.559254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.559208 2562 generic.go:358] "Generic (PLEG): container finished" podID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerID="b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827" exitCode=0 Apr 16 22:33:14.559422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:14.559280 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerDied","Data":"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827"} Apr 16 22:33:15.193076 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:33:15.193051 2562 logging.go:55] [core] [Channel #164 SubChannel #165]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.47:9003", ServerName: "10.132.0.47:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.47:9003: connect: connection refused" Apr 16 22:33:15.288219 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.288197 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:33:15.340264 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340200 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " Apr 16 22:33:15.340383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340271 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " Apr 16 22:33:15.340383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340301 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " Apr 16 22:33:15.340383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340335 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " Apr 16 22:33:15.340540 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340396 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfjbl\" (UniqueName: \"kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " 
Apr 16 22:33:15.340540 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340450 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds\") pod \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\" (UID: \"aa6df016-f078-46dc-a601-4bdcdf15c0b9\") " Apr 16 22:33:15.340667 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340544 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:15.340760 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340665 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:15.340833 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340762 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.340833 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340768 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:15.340957 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.340926 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:33:15.342448 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.342428 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:33:15.342528 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.342491 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl" (OuterVolumeSpecName: "kube-api-access-qfjbl") pod "aa6df016-f078-46dc-a601-4bdcdf15c0b9" (UID: "aa6df016-f078-46dc-a601-4bdcdf15c0b9"). InnerVolumeSpecName "kube-api-access-qfjbl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:33:15.441823 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.441797 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-uds\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.441929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.441824 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.441929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.441840 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tokenizer-tmp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.441929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.441854 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aa6df016-f078-46dc-a601-4bdcdf15c0b9-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.441929 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.441866 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfjbl\" (UniqueName: \"kubernetes.io/projected/aa6df016-f078-46dc-a601-4bdcdf15c0b9-kube-api-access-qfjbl\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:33:15.564971 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.564925 2562 generic.go:358] "Generic (PLEG): container finished" podID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerID="a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44" exitCode=0 Apr 16 22:33:15.565096 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.565022 2562 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" Apr 16 22:33:15.565164 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.565021 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerDied","Data":"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44"} Apr 16 22:33:15.565164 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.565146 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" event={"ID":"aa6df016-f078-46dc-a601-4bdcdf15c0b9","Type":"ContainerDied","Data":"675dce4a2a5bc4e881b53eefff47067b0edc637e55229f195d4d12c578919fdf"} Apr 16 22:33:15.565269 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.565173 2562 scope.go:117] "RemoveContainer" containerID="a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44" Apr 16 22:33:15.575456 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.575436 2562 scope.go:117] "RemoveContainer" containerID="b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827" Apr 16 22:33:15.583416 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.583397 2562 scope.go:117] "RemoveContainer" containerID="15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88" Apr 16 22:33:15.591135 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591116 2562 scope.go:117] "RemoveContainer" containerID="a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44" Apr 16 22:33:15.591406 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:33:15.591382 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44\": container with ID starting 
with a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44 not found: ID does not exist" containerID="a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44" Apr 16 22:33:15.591468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591413 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44"} err="failed to get container status \"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44\": rpc error: code = NotFound desc = could not find container \"a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44\": container with ID starting with a9b932005843a6cca8e340dacc25efec1127dc195e20e0aca5f300f493c5ed44 not found: ID does not exist" Apr 16 22:33:15.591468 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591432 2562 scope.go:117] "RemoveContainer" containerID="b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827" Apr 16 22:33:15.591691 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:33:15.591672 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827\": container with ID starting with b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827 not found: ID does not exist" containerID="b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827" Apr 16 22:33:15.591745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591697 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827"} err="failed to get container status \"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827\": rpc error: code = NotFound desc = could not find container \"b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827\": container with ID starting with 
b3c0084dd0f04677f2a576767592608e8a2cb24be6df85d76ff31d78d03d6827 not found: ID does not exist" Apr 16 22:33:15.591745 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591713 2562 scope.go:117] "RemoveContainer" containerID="15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88" Apr 16 22:33:15.591957 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:33:15.591939 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88\": container with ID starting with 15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88 not found: ID does not exist" containerID="15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88" Apr 16 22:33:15.592008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.591961 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88"} err="failed to get container status \"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88\": rpc error: code = NotFound desc = could not find container \"15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88\": container with ID starting with 15a32cfc1be70c946ea6b8320d1c0ca8f8815846bf845fb211f12d69d58ebd88 not found: ID does not exist" Apr 16 22:33:15.596997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.596972 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"] Apr 16 22:33:15.608781 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:15.608759 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv"] Apr 16 22:33:16.131190 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.131154 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" path="/var/lib/kubelet/pods/aa6df016-f078-46dc-a601-4bdcdf15c0b9/volumes" Apr 16 22:33:16.193517 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.193476 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-59bdb57f66-96dwv" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.47:9003\" within 1s: context deadline exceeded" Apr 16 22:33:16.329315 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329278 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5954b95474-bjhk4"] Apr 16 22:33:16.329643 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329629 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="storage-initializer" Apr 16 22:33:16.329643 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329644 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="storage-initializer" Apr 16 22:33:16.329723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329653 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="main" Apr 16 22:33:16.329723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329659 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="main" Apr 16 22:33:16.329723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329682 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="tokenizer" Apr 16 22:33:16.329723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329687 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="tokenizer" Apr 16 22:33:16.329850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329737 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="main" Apr 16 22:33:16.329850 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.329746 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa6df016-f078-46dc-a601-4bdcdf15c0b9" containerName="tokenizer" Apr 16 22:33:16.334266 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.334245 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.337094 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.337071 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 22:33:16.337217 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.337115 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zbj7l\"" Apr 16 22:33:16.341526 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.341445 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5954b95474-bjhk4"] Apr 16 22:33:16.448817 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.448720 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-cert\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.448817 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.448778 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7cs9\" (UniqueName: 
\"kubernetes.io/projected/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-kube-api-access-w7cs9\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.549834 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.549798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7cs9\" (UniqueName: \"kubernetes.io/projected/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-kube-api-access-w7cs9\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.550035 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.549881 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-cert\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.552210 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.552180 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-cert\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.558474 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.558436 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7cs9\" (UniqueName: \"kubernetes.io/projected/b00a2301-8423-4bdc-b11e-4c1ccd6034c8-kube-api-access-w7cs9\") pod \"llmisvc-controller-manager-5954b95474-bjhk4\" (UID: \"b00a2301-8423-4bdc-b11e-4c1ccd6034c8\") " pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.646071 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:33:16.646046 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:16.765107 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:16.765080 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5954b95474-bjhk4"] Apr 16 22:33:16.767029 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:33:16.767001 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb00a2301_8423_4bdc_b11e_4c1ccd6034c8.slice/crio-71cb5b81fa4e5d441d0a1609d6da4d03bfc00c31247e002cd5fbb04f23ff6e0d WatchSource:0}: Error finding container 71cb5b81fa4e5d441d0a1609d6da4d03bfc00c31247e002cd5fbb04f23ff6e0d: Status 404 returned error can't find the container with id 71cb5b81fa4e5d441d0a1609d6da4d03bfc00c31247e002cd5fbb04f23ff6e0d Apr 16 22:33:17.574527 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:17.574493 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" event={"ID":"b00a2301-8423-4bdc-b11e-4c1ccd6034c8","Type":"ContainerStarted","Data":"71cb5b81fa4e5d441d0a1609d6da4d03bfc00c31247e002cd5fbb04f23ff6e0d"} Apr 16 22:33:19.583089 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:19.583055 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" event={"ID":"b00a2301-8423-4bdc-b11e-4c1ccd6034c8","Type":"ContainerStarted","Data":"cb9209ebd6bb15080d6cc5001a44551d16dc656ecc2e1e04074984b37a59971c"} Apr 16 22:33:19.583469 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:19.583153 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:33:19.599396 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:19.599346 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" podStartSLOduration=0.93532495 podStartE2EDuration="3.599331923s" podCreationTimestamp="2026-04-16 22:33:16 +0000 UTC" firstStartedPulling="2026-04-16 22:33:16.768702125 +0000 UTC m=+1189.251221239" lastFinishedPulling="2026-04-16 22:33:19.432709097 +0000 UTC m=+1191.915228212" observedRunningTime="2026-04-16 22:33:19.596775981 +0000 UTC m=+1192.079295131" watchObservedRunningTime="2026-04-16 22:33:19.599331923 +0000 UTC m=+1192.081851058" Apr 16 22:33:50.588947 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:33:50.588901 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5954b95474-bjhk4" Apr 16 22:37:17.435043 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.434963 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:37:17.438444 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.438421 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.441669 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.441593 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:37:17.441669 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.441597 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 22:37:17.441669 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.441642 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:37:17.441915 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.441686 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-l254r\"" Apr 16 22:37:17.441915 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.441619 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:37:17.447734 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.447713 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:37:17.489135 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.489105 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.489266 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:37:17.489141 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.489266 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.489172 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.489266 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.489202 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.489390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.489275 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hls49\" (UniqueName: \"kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.489390 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.489306 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.520384 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.520349 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:37:17.524712 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.524680 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.526826 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.526801 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-sd5dx\"" Apr 16 22:37:17.536173 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.536150 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:37:17.590408 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590375 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9f9\" (UniqueName: \"kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590427 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590458 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hls49\" (UniqueName: \"kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590544 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590574 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590620 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590656 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590687 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590752 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590717 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.590995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590794 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.590995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.590956 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.591145 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.591025 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.591145 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.591074 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.592785 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.592763 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.593008 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.592989 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.597786 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.597766 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hls49\" (UniqueName: \"kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.691995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.691894 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.691995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.691962 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.691995 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.691994 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692298 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692110 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692298 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692200 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692298 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692237 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n9f9\" (UniqueName: \"kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692454 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692318 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692454 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692369 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692454 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692435 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.692558 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.692502 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.694498 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.694476 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.699414 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.699394 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9f9\" (UniqueName: \"kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.750582 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.750547 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:37:17.836131 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.836094 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:17.877974 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.877947 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:37:17.880848 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:37:17.880809 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdfad1c_f2b2_4e4d_8353_df9d919002a9.slice/crio-1050b324cc6970387940f2516103ae2b8be9d21ac42528ee671db238ac458b2c WatchSource:0}: Error finding container 1050b324cc6970387940f2516103ae2b8be9d21ac42528ee671db238ac458b2c: Status 404 returned error can't find the container with id 1050b324cc6970387940f2516103ae2b8be9d21ac42528ee671db238ac458b2c Apr 16 22:37:17.882399 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.882380 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:37:17.976021 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:17.975995 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:37:17.977793 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:37:17.977766 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddab21f30_9c84_4353_bb03_1b038759559b.slice/crio-c852248209991401d850ad28d6dd0c2fc7f636bd011c4ef123228022d43e174f WatchSource:0}: Error finding container c852248209991401d850ad28d6dd0c2fc7f636bd011c4ef123228022d43e174f: Status 404 returned error can't find the container with id c852248209991401d850ad28d6dd0c2fc7f636bd011c4ef123228022d43e174f Apr 16 22:37:18.519809 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:18.519774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerStarted","Data":"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c"} Apr 16 22:37:18.520270 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:18.519814 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerStarted","Data":"c852248209991401d850ad28d6dd0c2fc7f636bd011c4ef123228022d43e174f"} Apr 16 22:37:18.521271 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:18.521248 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerStarted","Data":"7be50fd307028f5e5e8b3b6240e23c463556a922842b75e28d45d689c434ad4d"} Apr 16 22:37:18.521393 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:18.521275 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerStarted","Data":"1050b324cc6970387940f2516103ae2b8be9d21ac42528ee671db238ac458b2c"} Apr 16 22:37:19.526928 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:19.526885 2562 
generic.go:358] "Generic (PLEG): container finished" podID="dab21f30-9c84-4353-bb03-1b038759559b" containerID="9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c" exitCode=0 Apr 16 22:37:19.527438 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:19.526951 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerDied","Data":"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c"} Apr 16 22:37:20.532796 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:20.532762 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerStarted","Data":"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce"} Apr 16 22:37:20.532796 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:20.532798 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerStarted","Data":"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188"} Apr 16 22:37:20.533272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:20.532879 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:20.554315 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:20.554246 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" podStartSLOduration=3.55422535 podStartE2EDuration="3.55422535s" podCreationTimestamp="2026-04-16 22:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:37:20.550933678 +0000 UTC m=+1433.033452843" watchObservedRunningTime="2026-04-16 22:37:20.55422535 +0000 UTC m=+1433.036744487" Apr 16 22:37:22.544259 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:22.544218 2562 generic.go:358] "Generic (PLEG): container finished" podID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerID="7be50fd307028f5e5e8b3b6240e23c463556a922842b75e28d45d689c434ad4d" exitCode=0 Apr 16 22:37:22.544687 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:22.544279 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerDied","Data":"7be50fd307028f5e5e8b3b6240e23c463556a922842b75e28d45d689c434ad4d"} Apr 16 22:37:27.837910 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:27.837415 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:27.837910 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:27.837455 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:27.839359 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:27.839204 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.50:8082/healthz\": dial tcp 10.132.0.50:8082: connect: connection refused" Apr 16 22:37:37.839014 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:37.838963 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:37.840357 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:37.840332 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:37:50.683099 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:50.683064 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerStarted","Data":"f5bfb187eff82791e67277bcf1a9d1a55ac52af902c3106513eabb69c765ad33"} Apr 16 22:37:50.702278 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:50.702223 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.5485022619999995 podStartE2EDuration="33.702204749s" podCreationTimestamp="2026-04-16 22:37:17 +0000 UTC" firstStartedPulling="2026-04-16 22:37:22.545547298 +0000 UTC m=+1435.028066412" lastFinishedPulling="2026-04-16 22:37:49.699249782 +0000 UTC m=+1462.181768899" observedRunningTime="2026-04-16 22:37:50.699240141 +0000 UTC m=+1463.181759278" watchObservedRunningTime="2026-04-16 22:37:50.702204749 +0000 UTC m=+1463.184723885" Apr 16 22:37:58.625451 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:37:58.625419 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:38:08.822267 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:08.822194 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:38:09.326753 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.326715 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:38:09.326921 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.326851 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.329320 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.329298 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-9crbd\"" Apr 16 22:38:09.329457 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.329342 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 22:38:09.461776 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqp55\" (UniqueName: \"kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.461939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461794 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.461939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.461939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461852 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.461939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461892 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.461939 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.461925 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562546 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562509 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqp55\" (UniqueName: \"kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55\") pod 
\"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562722 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562554 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562722 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562581 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562722 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562638 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562722 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562693 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.562722 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.562714 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.563235 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.563205 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.563425 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.563217 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.563554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.563270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.565058 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.565037 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.565203 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.565188 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.571050 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.571031 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqp55\" (UniqueName: \"kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55\") pod \"custom-route-timeout-pd-test-kserve-55895d758-8xvx5\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.637098 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.637038 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:09.775881 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:09.775852 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:38:09.777453 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:38:09.777423 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8aeac7_5a6b_4221_bed7_1846d615d3cc.slice/crio-ec453fef8da45cf3083dffe035b4be38d8cf70e377a370c2d5dd1fd29ac27c97 WatchSource:0}: Error finding container ec453fef8da45cf3083dffe035b4be38d8cf70e377a370c2d5dd1fd29ac27c97: Status 404 returned error can't find the container with id ec453fef8da45cf3083dffe035b4be38d8cf70e377a370c2d5dd1fd29ac27c97 Apr 16 22:38:10.763893 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:10.763857 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerStarted","Data":"ec453fef8da45cf3083dffe035b4be38d8cf70e377a370c2d5dd1fd29ac27c97"} Apr 16 22:38:11.769283 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:11.769247 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerStarted","Data":"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59"} Apr 16 22:38:11.769679 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:11.769360 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:12.775435 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:12.775402 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerStarted","Data":"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713"} Apr 16 22:38:16.801452 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:16.801415 2562 generic.go:358] "Generic (PLEG): container finished" podID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerID="a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713" exitCode=0 Apr 16 22:38:16.801846 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:16.801487 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerDied","Data":"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713"} Apr 16 22:38:17.808436 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:17.808398 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerStarted","Data":"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722"} Apr 16 22:38:17.831250 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:17.831184 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podStartSLOduration=8.490202783 podStartE2EDuration="9.831163874s" podCreationTimestamp="2026-04-16 22:38:08 +0000 UTC" firstStartedPulling="2026-04-16 22:38:09.779779884 +0000 UTC m=+1482.262298998" lastFinishedPulling="2026-04-16 22:38:11.120740976 +0000 UTC m=+1483.603260089" observedRunningTime="2026-04-16 22:38:17.827647191 +0000 UTC m=+1490.310166328" watchObservedRunningTime="2026-04-16 22:38:17.831163874 +0000 UTC m=+1490.313683006" Apr 16 22:38:19.637566 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:19.637530 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:19.637566 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:19.637573 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:19.639023 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:19.638994 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:38:29.637556 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:29.637511 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:38:29.650144 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:29.650117 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:38:39.638344 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:39.638296 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:38:49.638307 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:49.638195 2562 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:38:59.637425 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:38:59.637378 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:39:06.389537 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:06.389482 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:39:06.390063 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:06.389947 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="main" containerID="cri-o://e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188" gracePeriod=30 Apr 16 22:39:06.390131 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:06.390029 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="tokenizer" containerID="cri-o://2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce" gracePeriod=30 Apr 16 22:39:07.033361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.033322 2562 generic.go:358] "Generic (PLEG): container finished" podID="dab21f30-9c84-4353-bb03-1b038759559b" 
containerID="e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188" exitCode=0 Apr 16 22:39:07.033566 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.033392 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerDied","Data":"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188"} Apr 16 22:39:07.757466 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.757433 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:39:07.908873 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.908834 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n9f9\" (UniqueName: \"kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909069 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.908904 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909069 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.908935 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909069 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.908961 2562 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909069 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.908994 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909069 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.909023 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache\") pod \"dab21f30-9c84-4353-bb03-1b038759559b\" (UID: \"dab21f30-9c84-4353-bb03-1b038759559b\") " Apr 16 22:39:07.909361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.909236 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:07.909361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.909267 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:07.909361 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.909311 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:07.909839 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.909814 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:07.910980 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.910948 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:39:07.911093 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:07.911037 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9" (OuterVolumeSpecName: "kube-api-access-2n9f9") pod "dab21f30-9c84-4353-bb03-1b038759559b" (UID: "dab21f30-9c84-4353-bb03-1b038759559b"). InnerVolumeSpecName "kube-api-access-2n9f9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010239 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010270 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-tmp\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010284 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dab21f30-9c84-4353-bb03-1b038759559b-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010296 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-uds\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010309 2562 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dab21f30-9c84-4353-bb03-1b038759559b-tokenizer-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.010335 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.010337 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2n9f9\" (UniqueName: \"kubernetes.io/projected/dab21f30-9c84-4353-bb03-1b038759559b-kube-api-access-2n9f9\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:08.032794 ip-10-0-135-106 
kubenswrapper[2562]: I0416 22:39:08.032764 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:39:08.033040 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.033019 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="main" containerID="cri-o://f5bfb187eff82791e67277bcf1a9d1a55ac52af902c3106513eabb69c765ad33" gracePeriod=30 Apr 16 22:39:08.039933 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.039911 2562 generic.go:358] "Generic (PLEG): container finished" podID="dab21f30-9c84-4353-bb03-1b038759559b" containerID="2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce" exitCode=0 Apr 16 22:39:08.040038 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.039996 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" Apr 16 22:39:08.040082 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.039993 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerDied","Data":"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce"} Apr 16 22:39:08.040122 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.040100 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt" event={"ID":"dab21f30-9c84-4353-bb03-1b038759559b","Type":"ContainerDied","Data":"c852248209991401d850ad28d6dd0c2fc7f636bd011c4ef123228022d43e174f"} Apr 16 22:39:08.040122 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.040117 2562 scope.go:117] "RemoveContainer" 
containerID="2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce" Apr 16 22:39:08.048716 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.048693 2562 scope.go:117] "RemoveContainer" containerID="e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188" Apr 16 22:39:08.056671 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.056654 2562 scope.go:117] "RemoveContainer" containerID="9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c" Apr 16 22:39:08.064270 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.064248 2562 scope.go:117] "RemoveContainer" containerID="2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce" Apr 16 22:39:08.064570 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:39:08.064534 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce\": container with ID starting with 2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce not found: ID does not exist" containerID="2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce" Apr 16 22:39:08.064691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.064571 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce"} err="failed to get container status \"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce\": rpc error: code = NotFound desc = could not find container \"2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce\": container with ID starting with 2735413d7ef096c3a6f92cb3441965f8e7ab029ea8315043f839601519e243ce not found: ID does not exist" Apr 16 22:39:08.064691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.064594 2562 scope.go:117] "RemoveContainer" containerID="e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188" Apr 16 22:39:08.065141 
ip-10-0-135-106 kubenswrapper[2562]: E0416 22:39:08.065052 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188\": container with ID starting with e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188 not found: ID does not exist" containerID="e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188" Apr 16 22:39:08.065141 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.065105 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188"} err="failed to get container status \"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188\": rpc error: code = NotFound desc = could not find container \"e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188\": container with ID starting with e22b4a5ba9385cbc59328e7344ee7efd20282a5f9f24275e5740b00b49c06188 not found: ID does not exist" Apr 16 22:39:08.065141 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.065127 2562 scope.go:117] "RemoveContainer" containerID="9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c" Apr 16 22:39:08.065477 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:39:08.065428 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c\": container with ID starting with 9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c not found: ID does not exist" containerID="9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c" Apr 16 22:39:08.065477 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.065464 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c"} err="failed to get container status \"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c\": rpc error: code = NotFound desc = could not find container \"9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c\": container with ID starting with 9793a47f6092dc5c3209c21fa4d2b872619abd44cddce1d42b1af8b095f2e66c not found: ID does not exist" Apr 16 22:39:08.069374 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.069342 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:39:08.073906 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.073886 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schervxdt"] Apr 16 22:39:08.132241 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:08.132206 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab21f30-9c84-4353-bb03-1b038759559b" path="/var/lib/kubelet/pods/dab21f30-9c84-4353-bb03-1b038759559b/volumes" Apr 16 22:39:09.638234 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:09.638191 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:39:19.638448 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:19.638402 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection 
refused" Apr 16 22:39:29.637538 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:29.637490 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:39:38.169446 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.169415 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8bdfad1c-f2b2-4e4d-8353-df9d919002a9/main/0.log" Apr 16 22:39:38.169821 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.169796 2562 generic.go:358] "Generic (PLEG): container finished" podID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerID="f5bfb187eff82791e67277bcf1a9d1a55ac52af902c3106513eabb69c765ad33" exitCode=137 Apr 16 22:39:38.169871 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.169849 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerDied","Data":"f5bfb187eff82791e67277bcf1a9d1a55ac52af902c3106513eabb69c765ad33"} Apr 16 22:39:38.691312 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.691287 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8bdfad1c-f2b2-4e4d-8353-df9d919002a9/main/0.log" Apr 16 22:39:38.691679 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.691662 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:39:38.771349 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771324 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771357 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771379 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771446 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hls49\" (UniqueName: \"kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771510 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771468 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771775 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771597 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home\") pod \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\" (UID: \"8bdfad1c-f2b2-4e4d-8353-df9d919002a9\") " Apr 16 22:39:38.771775 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771590 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache" (OuterVolumeSpecName: "model-cache") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:38.771937 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.771916 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-model-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:38.772025 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.772003 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home" (OuterVolumeSpecName: "home") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:38.773829 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.773796 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49" (OuterVolumeSpecName: "kube-api-access-hls49") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "kube-api-access-hls49". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:39:38.773958 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.773803 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm" (OuterVolumeSpecName: "dshm") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:38.773958 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.773848 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:39:38.803864 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.803795 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bdfad1c-f2b2-4e4d-8353-df9d919002a9" (UID: "8bdfad1c-f2b2-4e4d-8353-df9d919002a9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:39:38.873030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.872992 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hls49\" (UniqueName: \"kubernetes.io/projected/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kube-api-access-hls49\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:38.873030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.873018 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-dshm\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:38.873030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.873027 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-home\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:38.873030 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.873036 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:38.873351 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:38.873046 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bdfad1c-f2b2-4e4d-8353-df9d919002a9-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:39:39.174874 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.174790 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_8bdfad1c-f2b2-4e4d-8353-df9d919002a9/main/0.log" Apr 16 22:39:39.175327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.175229 2562 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 22:39:39.175327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.175251 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bdfad1c-f2b2-4e4d-8353-df9d919002a9","Type":"ContainerDied","Data":"1050b324cc6970387940f2516103ae2b8be9d21ac42528ee671db238ac458b2c"} Apr 16 22:39:39.175327 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.175295 2562 scope.go:117] "RemoveContainer" containerID="f5bfb187eff82791e67277bcf1a9d1a55ac52af902c3106513eabb69c765ad33" Apr 16 22:39:39.195783 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.195767 2562 scope.go:117] "RemoveContainer" containerID="7be50fd307028f5e5e8b3b6240e23c463556a922842b75e28d45d689c434ad4d" Apr 16 22:39:39.199003 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.198978 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:39:39.205554 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.205531 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 22:39:39.637847 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:39.637799 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:39:40.132011 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:40.131975 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" path="/var/lib/kubelet/pods/8bdfad1c-f2b2-4e4d-8353-df9d919002a9/volumes" Apr 16 
22:39:49.638341 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:49.638293 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.51:8001/health\": dial tcp 10.132.0.51:8001: connect: connection refused" Apr 16 22:39:59.647528 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:59.647495 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:39:59.664221 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:39:59.664197 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:40:11.010164 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:11.010131 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:40:11.010599 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:11.010486 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" containerID="cri-o://4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" gracePeriod=30 Apr 16 22:40:24.844233 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844153 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844495 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="main" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:40:24.844505 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="main" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844516 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="storage-initializer" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844522 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="storage-initializer" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844531 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="main" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844536 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="main" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844554 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="storage-initializer" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844560 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="storage-initializer" Apr 16 22:40:24.844565 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844567 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="tokenizer" Apr 16 22:40:24.844891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844572 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="tokenizer" Apr 16 22:40:24.844891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844653 2562 
memory_manager.go:356] "RemoveStaleState removing state" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="main" Apr 16 22:40:24.844891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844664 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bdfad1c-f2b2-4e4d-8353-df9d919002a9" containerName="main" Apr 16 22:40:24.844891 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.844672 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="dab21f30-9c84-4353-bb03-1b038759559b" containerName="tokenizer" Apr 16 22:40:24.848288 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.848258 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.850511 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.850489 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 22:40:24.856154 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.856132 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:40:24.970124 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970077 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg8l7\" (UniqueName: \"kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.970124 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970125 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.970370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970159 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.970370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970195 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.970370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970218 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:24.970370 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:24.970259 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.071644 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071582 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.071644 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071648 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.071906 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071699 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.071906 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071725 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg8l7\" (UniqueName: \"kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7\") pod 
\"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.072016 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071904 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.072016 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.071927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.072230 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.072206 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.072292 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.072238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.072292 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.072264 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.074000 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.073971 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.074220 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.074204 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.079110 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.079087 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg8l7\" (UniqueName: \"kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7\") pod \"router-with-refs-pd-test-kserve-prefill-bf495787-qshsk\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.159589 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.159518 2562 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:25.293272 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.293239 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:40:25.294879 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:40:25.294848 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c8d9b1_a7c7_4edd_99c6_6cf2e73020e8.slice/crio-d3dda249f08f11b2418f06d14cfa718e0dcccee3335cd272edf78be570514e7f WatchSource:0}: Error finding container d3dda249f08f11b2418f06d14cfa718e0dcccee3335cd272edf78be570514e7f: Status 404 returned error can't find the container with id d3dda249f08f11b2418f06d14cfa718e0dcccee3335cd272edf78be570514e7f Apr 16 22:40:25.371589 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.371553 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerStarted","Data":"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a"} Apr 16 22:40:25.371589 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:25.371590 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerStarted","Data":"d3dda249f08f11b2418f06d14cfa718e0dcccee3335cd272edf78be570514e7f"} Apr 16 22:40:30.392686 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:30.392652 2562 generic.go:358] "Generic (PLEG): container finished" podID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerID="66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a" exitCode=0 Apr 16 22:40:30.393078 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:40:30.392691 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerDied","Data":"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a"} Apr 16 22:40:31.398697 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:31.398656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerStarted","Data":"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a"} Apr 16 22:40:31.420572 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:31.420513 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podStartSLOduration=7.420494689 podStartE2EDuration="7.420494689s" podCreationTimestamp="2026-04-16 22:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:31.416895697 +0000 UTC m=+1623.899414834" watchObservedRunningTime="2026-04-16 22:40:31.420494689 +0000 UTC m=+1623.903013827" Apr 16 22:40:35.160404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:35.160361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:35.160404 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:35.160414 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:40:35.161680 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:35.161646 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" 
podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:40:41.010715 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.010663 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="llm-d-routing-sidecar" containerID="cri-o://7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" gracePeriod=2 Apr 16 22:40:41.413955 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.413928 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-55895d758-8xvx5_6b8aeac7-5a6b-4221-bed7-1846d615d3cc/main/0.log" Apr 16 22:40:41.414723 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.414704 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:40:41.441046 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441022 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-55895d758-8xvx5_6b8aeac7-5a6b-4221-bed7-1846d615d3cc/main/0.log" Apr 16 22:40:41.441794 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441760 2562 generic.go:358] "Generic (PLEG): container finished" podID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerID="4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" exitCode=137 Apr 16 22:40:41.441794 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441791 2562 generic.go:358] "Generic (PLEG): container finished" podID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerID="7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" exitCode=0 Apr 16 22:40:41.441997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerDied","Data":"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722"} Apr 16 22:40:41.441997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441918 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerDied","Data":"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59"} Apr 16 22:40:41.441997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441935 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" 
event={"ID":"6b8aeac7-5a6b-4221-bed7-1846d615d3cc","Type":"ContainerDied","Data":"ec453fef8da45cf3083dffe035b4be38d8cf70e377a370c2d5dd1fd29ac27c97"} Apr 16 22:40:41.441997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441942 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5" Apr 16 22:40:41.441997 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.441954 2562 scope.go:117] "RemoveContainer" containerID="4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" Apr 16 22:40:41.466774 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.466577 2562 scope.go:117] "RemoveContainer" containerID="a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713" Apr 16 22:40:41.515276 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515243 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.515438 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515295 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.515438 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515324 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.515438 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515393 2562 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.515629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515444 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.515629 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.515487 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqp55\" (UniqueName: \"kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55\") pod \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\" (UID: \"6b8aeac7-5a6b-4221-bed7-1846d615d3cc\") " Apr 16 22:40:41.516689 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.516482 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home" (OuterVolumeSpecName: "home") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.516689 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.516645 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache" (OuterVolumeSpecName: "model-cache") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.532308 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.532276 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm" (OuterVolumeSpecName: "dshm") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.532440 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.532339 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:40:41.543566 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.541133 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55" (OuterVolumeSpecName: "kube-api-access-sqp55") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "kube-api-access-sqp55". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:40:41.549900 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.549878 2562 scope.go:117] "RemoveContainer" containerID="7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" Apr 16 22:40:41.558208 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.558191 2562 scope.go:117] "RemoveContainer" containerID="4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" Apr 16 22:40:41.558488 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:40:41.558464 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722\": container with ID starting with 4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722 not found: ID does not exist" containerID="4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" Apr 16 22:40:41.558563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.558497 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722"} err="failed to get container status \"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722\": rpc error: code = NotFound desc = could not find container \"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722\": container with ID starting with 4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722 not found: ID does not exist" Apr 16 22:40:41.558563 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.558517 2562 scope.go:117] "RemoveContainer" containerID="a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713" Apr 16 22:40:41.558787 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:40:41.558769 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713\": container with ID starting with a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713 not found: ID does not exist" containerID="a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713" Apr 16 22:40:41.558830 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.558795 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713"} err="failed to get container status \"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713\": rpc error: code = NotFound desc = could not find container \"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713\": container with ID starting with a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713 not found: ID does not exist" Apr 16 22:40:41.558830 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.558810 2562 scope.go:117] "RemoveContainer" containerID="7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" Apr 16 22:40:41.559059 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:40:41.559043 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59\": container with ID starting with 7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59 not found: ID does not exist" containerID="7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" Apr 16 22:40:41.559102 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559065 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59"} err="failed to get container status \"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59\": rpc error: code = NotFound desc = could not find container 
\"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59\": container with ID starting with 7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59 not found: ID does not exist" Apr 16 22:40:41.559102 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559078 2562 scope.go:117] "RemoveContainer" containerID="4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722" Apr 16 22:40:41.559316 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559288 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722"} err="failed to get container status \"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722\": rpc error: code = NotFound desc = could not find container \"4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722\": container with ID starting with 4cf29feb42f9be1d505288a5d2f43302c1cd003ab017bf04f41805f58c1c2722 not found: ID does not exist" Apr 16 22:40:41.559357 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559322 2562 scope.go:117] "RemoveContainer" containerID="a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713" Apr 16 22:40:41.559531 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559515 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713"} err="failed to get container status \"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713\": rpc error: code = NotFound desc = could not find container \"a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713\": container with ID starting with a2330f24f818e51c80b5b5cc28d4148fd6e2c560a3ed06466489f259e8e55713 not found: ID does not exist" Apr 16 22:40:41.559581 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559531 2562 scope.go:117] "RemoveContainer" 
containerID="7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59" Apr 16 22:40:41.559790 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.559772 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59"} err="failed to get container status \"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59\": rpc error: code = NotFound desc = could not find container \"7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59\": container with ID starting with 7e1eacd4620335170cc67b09df3a4dfa79fbe541086ea039fea51da69848bf59 not found: ID does not exist" Apr 16 22:40:41.590729 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.590705 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b8aeac7-5a6b-4221-bed7-1846d615d3cc" (UID: "6b8aeac7-5a6b-4221-bed7-1846d615d3cc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:40:41.616198 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616173 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.616198 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616196 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-model-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.616333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616206 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.616333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616217 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-home\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.616333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616226 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqp55\" (UniqueName: \"kubernetes.io/projected/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-kube-api-access-sqp55\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.616333 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.616236 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b8aeac7-5a6b-4221-bed7-1846d615d3cc-dshm\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:40:41.765905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.765872 2562 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:40:41.772515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:41.772489 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-55895d758-8xvx5"] Apr 16 22:40:42.130520 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:42.130473 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" path="/var/lib/kubelet/pods/6b8aeac7-5a6b-4221-bed7-1846d615d3cc/volumes" Apr 16 22:40:45.160112 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:45.160066 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:40:55.160515 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:40:55.160469 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:05.160243 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:05.160183 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:15.160691 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:15.160637 2562 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:25.160079 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:25.160030 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:35.159926 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:35.159881 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:45.160294 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:45.160237 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:41:55.160242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:41:55.160187 2562 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" probeResult="failure" output="Get \"https://10.132.0.52:8000/health\": dial tcp 10.132.0.52:8000: connect: connection refused" Apr 16 22:42:05.169348 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:05.169309 
2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:42:05.177350 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:05.177326 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:42:15.627103 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:15.627064 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:42:15.627702 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:15.627439 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" containerID="cri-o://d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a" gracePeriod=30 Apr 16 22:42:31.006818 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:31.006784 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:31.026057 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:31.026026 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:32.018194 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:32.018160 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:32.028097 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:32.028074 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:33.039341 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:33.039310 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:33.047195 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:33.047170 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:34.038119 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:34.038093 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:34.045158 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:34.045137 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:35.029928 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:35.029894 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:35.037581 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:35.037557 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:35.993120 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:35.993089 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:36.001542 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:36.001515 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:36.983076 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:36.983048 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:36.991182 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:36.991155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:38.008208 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:38.008174 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:38.015383 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:38.015360 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:39.027846 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:39.027814 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:39.035587 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:39.035561 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:40.041780 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:40.041747 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:40.054067 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:40.054041 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:41.059886 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:41.059849 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:41.070066 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:41.070034 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:42.081040 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:42.081008 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:42.089823 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:42.089797 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:43.063357 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:43.063321 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:43.071884 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:43.071855 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:44.045701 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:44.045673 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/main/0.log" Apr 16 22:42:44.056173 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:44.056147 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-bf495787-qshsk_c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/storage-initializer/0.log" Apr 16 22:42:44.975242 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:44.975208 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-lsndq_c4dc88be-3703-4f37-817c-53ef9e2bd820/discovery/0.log" Apr 16 22:42:45.818540 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.818504 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-lsndq_c4dc88be-3703-4f37-817c-53ef9e2bd820/discovery/0.log" Apr 16 22:42:45.869002 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.868982 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:42:45.922012 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.921940 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg8l7\" (UniqueName: \"kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922016 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922059 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922085 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922121 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922109 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922263 
ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922177 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home\") pod \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\" (UID: \"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8\") " Apr 16 22:42:45.922422 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922395 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache" (OuterVolumeSpecName: "model-cache") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:45.922537 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.922521 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home" (OuterVolumeSpecName: "home") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:45.924675 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.924648 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm" (OuterVolumeSpecName: "dshm") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:45.924837 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.924783 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:42:45.925282 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.925260 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7" (OuterVolumeSpecName: "kube-api-access-sg8l7") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). InnerVolumeSpecName "kube-api-access-sg8l7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:42:45.952431 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.952398 2562 generic.go:358] "Generic (PLEG): container finished" podID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerID="d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a" exitCode=137 Apr 16 22:42:45.952550 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.952472 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" Apr 16 22:42:45.952550 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.952483 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerDied","Data":"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a"} Apr 16 22:42:45.952550 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.952525 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk" event={"ID":"c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8","Type":"ContainerDied","Data":"d3dda249f08f11b2418f06d14cfa718e0dcccee3335cd272edf78be570514e7f"} Apr 16 22:42:45.952550 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.952542 2562 scope.go:117] "RemoveContainer" 
containerID="d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a" Apr 16 22:42:45.972431 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.972418 2562 scope.go:117] "RemoveContainer" containerID="66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a" Apr 16 22:42:45.982356 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:45.982325 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" (UID: "c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:46.023765 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023730 2562 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-home\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.023765 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023762 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sg8l7\" (UniqueName: \"kubernetes.io/projected/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kube-api-access-sg8l7\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.023940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023773 2562 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-dshm\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.023940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023783 2562 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-tls-certs\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 
22:42:46.023940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023792 2562 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-model-cache\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.023940 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.023800 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8-kserve-provision-location\") on node \"ip-10-0-135-106.ec2.internal\" DevicePath \"\"" Apr 16 22:42:46.037125 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.037102 2562 scope.go:117] "RemoveContainer" containerID="d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a" Apr 16 22:42:46.037468 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:42:46.037440 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a\": container with ID starting with d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a not found: ID does not exist" containerID="d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a" Apr 16 22:42:46.037533 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.037477 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a"} err="failed to get container status \"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a\": rpc error: code = NotFound desc = could not find container \"d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a\": container with ID starting with d4cf3de7591964e5200de4c273a6ed5cd2aff0fc71ab0d5ec2451f29807fcd8a not found: ID does not exist" Apr 16 22:42:46.037533 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:42:46.037497 2562 scope.go:117] "RemoveContainer" containerID="66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a" Apr 16 22:42:46.037812 ip-10-0-135-106 kubenswrapper[2562]: E0416 22:42:46.037790 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a\": container with ID starting with 66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a not found: ID does not exist" containerID="66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a" Apr 16 22:42:46.037895 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.037822 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a"} err="failed to get container status \"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a\": rpc error: code = NotFound desc = could not find container \"66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a\": container with ID starting with 66475f8421a2e08144ae64376eb1544249a9581bd6e751c53085f13a89f3568a not found: ID does not exist" Apr 16 22:42:46.270238 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.270161 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:42:46.275428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.275402 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-bf495787-qshsk"] Apr 16 22:42:46.636799 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.636768 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-22b7x_471e9843-40f5-4d9c-a5ca-656553b34579/manager/0.log" Apr 16 22:42:46.690007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.689976 
2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-88ww9_34a1ee5a-41c2-4f38-a715-36a1e3816faf/manager/0.log" Apr 16 22:42:46.714068 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:46.714043 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-wrrnt_9c333486-f7ee-42fc-8009-f0ee787db97a/manager/0.log" Apr 16 22:42:48.130882 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:48.130840 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" path="/var/lib/kubelet/pods/c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8/volumes" Apr 16 22:42:51.729039 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:51.729006 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9b5sq_419df959-4512-4006-ba6a-cca963743f66/global-pull-secret-syncer/0.log" Apr 16 22:42:51.786472 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:51.786443 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8nkjl_7073fe4e-234e-40a7-b337-e387d20bd403/konnectivity-agent/0.log" Apr 16 22:42:51.906534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:51.906514 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-106.ec2.internal_477d8d5a68adc15182b0ab0c3cde7f73/haproxy/0.log" Apr 16 22:42:55.699274 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:55.699247 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-22b7x_471e9843-40f5-4d9c-a5ca-656553b34579/manager/0.log" Apr 16 22:42:55.825986 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:55.825954 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-88ww9_34a1ee5a-41c2-4f38-a715-36a1e3816faf/manager/0.log" Apr 16 22:42:55.888211 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:55.888178 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-wrrnt_9c333486-f7ee-42fc-8009-f0ee787db97a/manager/0.log" Apr 16 22:42:56.873758 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.873728 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/alertmanager/0.log" Apr 16 22:42:56.898401 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.898379 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/config-reloader/0.log" Apr 16 22:42:56.928480 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.928462 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/kube-rbac-proxy-web/0.log" Apr 16 22:42:56.949489 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.949463 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/kube-rbac-proxy/0.log" Apr 16 22:42:56.973437 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.973417 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/kube-rbac-proxy-metric/0.log" Apr 16 22:42:56.994469 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:56.994446 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/prom-label-proxy/0.log" Apr 16 22:42:57.019871 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.019853 2562 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f60918ff-7aa3-4049-8c22-166f6ccb9eaf/init-config-reloader/0.log" Apr 16 22:42:57.172191 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.172111 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-b97qf_085d2591-737c-4156-a33c-8c6f9b307ada/monitoring-plugin/0.log" Apr 16 22:42:57.365006 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.364982 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtdtz_50c10cf9-25e9-4cb5-a882-5cb45721bda9/node-exporter/0.log" Apr 16 22:42:57.386176 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.386153 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtdtz_50c10cf9-25e9-4cb5-a882-5cb45721bda9/kube-rbac-proxy/0.log" Apr 16 22:42:57.408318 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.408289 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtdtz_50c10cf9-25e9-4cb5-a882-5cb45721bda9/init-textfile/0.log" Apr 16 22:42:57.788047 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.788015 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d9b5c87-tgz2n_10d51dde-059f-426f-91fb-fce0490d41c3/telemeter-client/0.log" Apr 16 22:42:57.813397 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.813373 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d9b5c87-tgz2n_10d51dde-059f-426f-91fb-fce0490d41c3/reload/0.log" Apr 16 22:42:57.844861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:42:57.844842 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d9b5c87-tgz2n_10d51dde-059f-426f-91fb-fce0490d41c3/kube-rbac-proxy/0.log" Apr 16 22:43:00.405845 ip-10-0-135-106 kubenswrapper[2562]: I0416 
22:43:00.405817 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-nc9dz_92f991e3-505f-4a83-870d-ff28b1ed3fad/download-server/0.log" Apr 16 22:43:00.821149 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821111 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"] Apr 16 22:43:00.821637 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821588 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="llm-d-routing-sidecar" Apr 16 22:43:00.821637 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821629 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="llm-d-routing-sidecar" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821657 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="storage-initializer" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821667 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="storage-initializer" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821676 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="storage-initializer" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821684 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="storage-initializer" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821698 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" Apr 16 
22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821706 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821720 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821727 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821819 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="main" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821833 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b8aeac7-5a6b-4221-bed7-1846d615d3cc" containerName="llm-d-routing-sidecar" Apr 16 22:43:00.821861 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.821839 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3c8d9b1-a7c7-4edd-99c6-6cf2e73020e8" containerName="main" Apr 16 22:43:00.825099 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.825080 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:00.827151 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.827136 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"openshift-service-ca.crt\"" Apr 16 22:43:00.827376 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.827357 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6fvlv\"/\"default-dockercfg-t4h5p\"" Apr 16 22:43:00.827985 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.827969 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"kube-root-ca.crt\"" Apr 16 22:43:00.831419 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.831398 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"] Apr 16 22:43:00.932484 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.932461 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-proc\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:00.932628 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.932493 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-lib-modules\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:00.932628 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.932518 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-podres\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:00.932714 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.932655 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-sys\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:00.932714 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:00.932689 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5n9c\" (UniqueName: \"kubernetes.io/projected/2ef77791-8b01-4b83-95cd-21bbffe61bf0-kube-api-access-g5n9c\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:01.033852 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033829 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-proc\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033859 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-lib-modules\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " 
pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033876 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-podres\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033934 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-proc\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033937 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-sys\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5n9c\" (UniqueName: \"kubernetes.io/projected/2ef77791-8b01-4b83-95cd-21bbffe61bf0-kube-api-access-g5n9c\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034007 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.033995 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-podres\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034310 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.034015 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-sys\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.034310 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.034092 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ef77791-8b01-4b83-95cd-21bbffe61bf0-lib-modules\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.041025 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.041000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5n9c\" (UniqueName: \"kubernetes.io/projected/2ef77791-8b01-4b83-95cd-21bbffe61bf0-kube-api-access-g5n9c\") pod \"perf-node-gather-daemonset-fnb8l\" (UID: \"2ef77791-8b01-4b83-95cd-21bbffe61bf0\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.136254 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.136187 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:01.258403 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.258378 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"]
Apr 16 22:43:01.259767 ip-10-0-135-106 kubenswrapper[2562]: W0416 22:43:01.259731 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2ef77791_8b01_4b83_95cd_21bbffe61bf0.slice/crio-9e8477066b17d9a107c4ee1412f374d0da064963199e3b9a2da3ea2e52c373c2 WatchSource:0}: Error finding container 9e8477066b17d9a107c4ee1412f374d0da064963199e3b9a2da3ea2e52c373c2: Status 404 returned error can't find the container with id 9e8477066b17d9a107c4ee1412f374d0da064963199e3b9a2da3ea2e52c373c2
Apr 16 22:43:01.261342 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.261326 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:43:01.682597 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.682572 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zprfg_896dfd17-7377-42ff-b2c1-0ff2bbb1909a/dns/0.log"
Apr 16 22:43:01.702043 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.702021 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zprfg_896dfd17-7377-42ff-b2c1-0ff2bbb1909a/kube-rbac-proxy/0.log"
Apr 16 22:43:01.745322 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:01.745300 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l2b5r_c19bbb1b-93c9-40fc-9dc8-bc5463213a6d/dns-node-resolver/0.log"
Apr 16 22:43:02.021944 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:02.021862 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" event={"ID":"2ef77791-8b01-4b83-95cd-21bbffe61bf0","Type":"ContainerStarted","Data":"499db2d10d6dab2bfb969f034740695f4f05759c08544a9c45f6f18b4fb96a51"}
Apr 16 22:43:02.021944 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:02.021904 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" event={"ID":"2ef77791-8b01-4b83-95cd-21bbffe61bf0","Type":"ContainerStarted","Data":"9e8477066b17d9a107c4ee1412f374d0da064963199e3b9a2da3ea2e52c373c2"}
Apr 16 22:43:02.021944 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:02.021922 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:02.039905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:02.039858 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l" podStartSLOduration=2.039842554 podStartE2EDuration="2.039842554s" podCreationTimestamp="2026-04-16 22:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:43:02.037079762 +0000 UTC m=+1774.519598947" watchObservedRunningTime="2026-04-16 22:43:02.039842554 +0000 UTC m=+1774.522361756"
Apr 16 22:43:02.257968 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:02.257945 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4jzbz_d260be2e-0541-4595-95d1-cf52b077b22b/node-ca/0.log"
Apr 16 22:43:03.115673 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:03.115646 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-lsndq_c4dc88be-3703-4f37-817c-53ef9e2bd820/discovery/0.log"
Apr 16 22:43:03.697412 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:03.697382 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xg95t_262b53c6-89e8-4fcb-9d2d-6de1c03648ad/serve-healthcheck-canary/0.log"
Apr 16 22:43:04.245183 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:04.245154 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6wx9g_8d8e91a5-19e6-40c2-b353-674e0838577c/kube-rbac-proxy/0.log"
Apr 16 22:43:04.265428 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:04.265406 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6wx9g_8d8e91a5-19e6-40c2-b353-674e0838577c/exporter/0.log"
Apr 16 22:43:04.286984 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:04.286964 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6wx9g_8d8e91a5-19e6-40c2-b353-674e0838577c/extractor/0.log"
Apr 16 22:43:06.982347 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:06.982317 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-rh468_f5e8b5a8-1a99-4a2e-aaba-ee4be6a17ed3/openshift-lws-operator/0.log"
Apr 16 22:43:07.534544 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:07.534510 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5954b95474-bjhk4_b00a2301-8423-4bdc-b11e-4c1ccd6034c8/manager/0.log"
Apr 16 22:43:07.552391 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:07.552366 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-zm8l6_e2f0453c-f168-46f3-9c2b-6a1250bc1db5/server/0.log"
Apr 16 22:43:07.781905 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:07.781875 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-ftrm9_ff881812-87ac-4176-bad1-2d5b98e46069/manager/0.log"
Apr 16 22:43:07.828237 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:07.828173 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-pl5vk_67da3a4d-636d-4917-9c9a-1e717a8394bb/seaweedfs/0.log"
Apr 16 22:43:08.036013 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:08.035985 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-fnb8l"
Apr 16 22:43:13.713746 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.713719 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:43:13.735094 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.735065 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/egress-router-binary-copy/0.log"
Apr 16 22:43:13.755511 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.755488 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/cni-plugins/0.log"
Apr 16 22:43:13.775495 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.775476 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/bond-cni-plugin/0.log"
Apr 16 22:43:13.794685 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.794665 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/routeoverride-cni/0.log"
Apr 16 22:43:13.814885 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.814869 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/whereabouts-cni-bincopy/0.log"
Apr 16 22:43:13.839107 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:13.839077 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6f9g5_19b43ea6-fab0-42f9-83dc-7b9ced78d6fa/whereabouts-cni/0.log"
Apr 16 22:43:14.198258 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:14.198234 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qcw82_bff6b952-968b-4fe3-a43f-333dead963bc/kube-multus/0.log"
Apr 16 22:43:14.223772 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:14.223749 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4zqvj_ef0b8b85-4299-4164-b2f4-ae06377db331/network-metrics-daemon/0.log"
Apr 16 22:43:14.245130 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:14.245111 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4zqvj_ef0b8b85-4299-4164-b2f4-ae06377db331/kube-rbac-proxy/0.log"
Apr 16 22:43:15.372569 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.372537 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/ovn-controller/0.log"
Apr 16 22:43:15.396202 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.396178 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/ovn-acl-logging/0.log"
Apr 16 22:43:15.413185 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.413160 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/kube-rbac-proxy-node/0.log"
Apr 16 22:43:15.434233 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.434209 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 22:43:15.452614 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.452578 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/northd/0.log"
Apr 16 22:43:15.473103 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.473082 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/nbdb/0.log"
Apr 16 22:43:15.492741 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.492719 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/sbdb/0.log"
Apr 16 22:43:15.604968 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:15.604944 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fwm7d_c068e6b8-2d0c-45f9-a80f-87d043b56b89/ovnkube-controller/0.log"
Apr 16 22:43:16.996534 ip-10-0-135-106 kubenswrapper[2562]: I0416 22:43:16.996423 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-krggd_d78565aa-9f67-4043-a21a-fe0e9c37b4c3/network-check-target-container/0.log"