Author(s): Lopes, B.
Catarino, S.
Souto, N.
Dinis, R.
Cercas, F.
Date: 2018
Title: Robust joint synchronization and channel estimation approach for frequency-selective environments
Volume: 6
Pages: 53180 - 53190
ISSN: 2169-3536
DOI (Digital Object Identifier): 10.1109/ACCESS.2018.2871060
Keywords: Channel estimation
Compressive sensing
Sparse signal recovery
Time synchronization
Abstract: Supporting spontaneous low-latency machine-type communications requires fast synchronization and channel estimation at the receiver. The problems of synchronizing the received frame and estimating the channel coefficients are often addressed separately, with the latter relying on accurate timing acquisition. While these conventional approaches can be adequate in flat-fading environments, time-dispersive channels can have a negative impact on both tasks and severely degrade the performance of the receiver. To circumvent this large degradation, in this paper we consider a sparsity-based reconstruction approach for joint timing synchronization and channel estimation, formulating the problem in a form closely related to the Compressive Sensing (CS) framework. Using modified versions of well-known sparse reconstruction techniques, which can exploit additional signal structure beyond sparsity, it is shown through numerical simulations that, even with short training sequences, excellent timing synchronization and channel estimation performance can be achieved in both single-user and multiuser scenarios.
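The CS formulation summarized in the abstract can be illustrated with a minimal sketch: the known training sequence shifted by every candidate delay forms the columns of a dictionary, so the unknown timing offset and the sparse channel taps are recovered together as one sparse vector. The sketch below uses plain Orthogonal Matching Pursuit (OMP) for this purpose; the paper employs modified sparse reconstruction techniques that exploit additional signal structure, and all parameter values here (sequence length, delays, tap gains) are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64          # observation window length (assumed)
P = 32          # training sequence length (assumed)
max_shift = 24  # search range covering timing offset + delay spread

# Known training sequence (random BPSK as a stand-in)
s = rng.choice([-1.0, 1.0], size=P)

# Dictionary: column k is the training sequence delayed by k samples,
# so a sparse coefficient vector jointly encodes timing and channel taps.
A = np.zeros((N, max_shift))
for k in range(max_shift):
    A[k:k + P, k] = s

# Hypothetical ground truth: timing offset 5, three channel taps
x_true = np.zeros(max_shift)
x_true[5], x_true[7], x_true[9] = 1.0, 0.6, -0.3

y = A @ x_true + 0.01 * rng.standard_normal(N)

def omp(A, y, n_taps):
    """Orthogonal Matching Pursuit: greedily select the dictionary
    column most correlated with the residual, then re-fit the
    coefficients on the chosen support by least squares."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(n_taps):
        idx = int(np.argmax(np.abs(A.T @ residual)))
        support.append(idx)
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

x_hat = omp(A, y, n_taps=3)
# The first active tap gives the frame timing offset; the nonzero
# entries give the channel impulse response at the same time.
timing = int(np.flatnonzero(np.abs(x_hat) > 0.1)[0])
```

Because every column is just a shift of the same short training sequence, one sparse recovery pass replaces the usual two-stage pipeline of correlation-based timing acquisition followed by least-squares channel estimation.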
Peer reviewed: yes
Access type: Open Access
Appears in Collections: IT-RI - Articles in international scientific journals with peer review

Files in This Item:
File: JointTimingChanEstim_IEEEAccess_published.pdf (Publisher's version, 5.5 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.