Implementation of High-Resolution Angle Estimator for an Unmanned Ground Vehicle
  • CC BY-NC (non-commercial)
ABSTRACT
KEYWORDS
Angle Estimation, Beam Space MUSIC, FMCW Radar, Spatial Smoothing
  • I. INTRODUCTION

    Radar has been developed for automotive sensor systems since the 1970s because of its long range, accuracy, and adaptability to all-weather conditions. Initially, the key idea of the automotive radar system was collision avoidance; however, many different applications of the vehicle radar system are now under development [1]. For example, convenience equipment, such as adaptive cruise control (ACC), has already been commercialized, and safety systems for autonomous driving are the major topics of current radar research [1,2].

    One of the requirements for radar in an autonomous driving system is high-resolution capability, which is very difficult to achieve with small automotive radars because angular resolution depends directly on the antenna aperture size. However, parameter estimation methods using an array antenna can overcome this size limitation. Multiple signal classification (MUSIC) is a well-known array processing method for high-resolution angle estimation [3-5], and its applications are being extended to various areas [6-9]. In this method, the angular resolution is independent of the aperture size under ideal assumptions, such as uncorrelated signals, high signal-to-noise ratio (SNR), and a large number of samples [10,11].

    Unfortunately, these assumptions cannot be satisfied in practice, as in automotive radar systems, because of highly correlated signals, low SNR, and a small number of snapshots. Research efforts have attempted to overcome these limitations; most are de-correlation methods based on preprocessing schemes, such as forward/backward (FB) averaging or spatial smoothing (SS) [12]. In another approach, beamspace MUSIC (BS-MUSIC) has been studied to improve radar performance in low-SNR conditions. Several studies have reported that BS-MUSIC has the advantages of low sensitivity to system errors, a reduced resolution threshold, and improved performance in environments with spatially colored noise [13-17].

    In this paper, we focus on increasing the number of snapshots by applying sawtooth waveforms. We show that this waveform enhances the angle estimation capability of MUSIC in addition to its well-known advantages of removing target ambiguity and improving velocity resolution. Despite these advantages, sawtooth waveforms have not often been used because of their high computational cost. However, due to advances in digital technology, such as digital signal processors (DSPs) and field-programmable gate arrays (FPGAs), we can now implement signal processing units that include sawtooth waveforms and forward/backward spatial smoothing beamspace MUSIC (FBSS BS-MUSIC), one of the high-resolution angle estimators.

    In cooperation with the Korea Agency for Defense Development, we developed a 77-GHz FMCW radar sensor as part of the sensor system for the defense unmanned ground vehicle (UGV) or robot, which is designed to run on unpaved or bumpy terrain, such as mountain roads. Fig. 1 shows the operational concept of the radar in a UGV.

    Compared with commercial automotive radar, UGV radar must be capable of detecting slower targets in a harsh, cluttered environment, covering a wide azimuth sector of up to 120°, and, at the same time, resolving targets with high angular resolution. In order to meet these requirements, we applied sawtooth waveforms in our process and designed the system using array antennas, an FPGA-based preprocessor for digital beam forming (DBF), and a DSP for complex signal processing and high-resolution angle estimation.

    This paper is organized as follows: Section II describes our FMCW radar sensor and the signal processing method, including waveform design and high-resolution angle estimation. Section III shows the experimental results with simulated data as well as measured data. Conclusions are presented in Section IV.

    II. SYSTEM DESCRIPTION

    The radar system consists of antenna devices, transmitter/receiver, and the signal processing unit, as shown in Fig. 2. The transmitter/receiver has a homodyne structure. One broad illuminating transmitting beam and eight receiving beams overlap in the azimuth to yield a total azimuthal coverage of 60°. DBF was developed by using these eight received signals. The 3-dB beamwidth after DBF is about 15°. Some specifications of the transmitter/receiver are listed in Table 1.
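    As an aside on how DBF combines the eight channel outputs into steered beams, the sketch below applies phase-steering weights to a matrix of channel samples. The function name, half-wavelength element spacing, and steering-vector convention are illustrative assumptions rather than details taken from this system.

```python
import numpy as np

def dbf_beams(channels, steer_deg, d=0.5):
    """Combine M receive channels into steered beams (digital beam forming).

    channels: M x N complex array (M antenna channels, N time samples).
    steer_deg: beam steering angles in degrees.
    d: element spacing in wavelengths (half-wavelength assumed here).
    """
    M = channels.shape[0]
    k = np.arange(M)[:, None]                      # element index column
    angles = np.radians(np.atleast_1d(steer_deg))  # one column per beam
    W = np.exp(-2j * np.pi * d * k * np.sin(angles))
    return W.conj().T @ channels / M               # beams x samples
```

    For a plane wave arriving from a given angle, the beam steered to that angle produces the largest (matched-filter) output, while neighboring beams roll off according to the array pattern.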

    [Table 1.] Specifications of the transmitter/receiver

       1. Signal Processing Unit

    Due to impairments of the radio frequency (RF) front end, the signal vector is distorted, which degrades the resolution. Thus, the first step after converting the input to a digital signal is channel alignment using a pre-measured calibration matrix. Each data signal is transferred to the frequency domain through the first FFT to extract the range information according to the FMCW principle. Then, the corrected signals are transferred to the beam space via DBF. Because of the sawtooth waveform, a second FFT is necessary to extract the velocity information; this is explained in the next section. The resulting data from the second FFT, called the range-Doppler map, are used to detect targets via the constant false alarm rate (CFAR) method. After this main detection process, MUSIC is performed using the information obtained in advance. The whole block diagram is shown in Fig. 3(a).
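    The two-FFT portion of this chain can be sketched as follows; the array shapes and function name are illustrative assumptions, and calibration, DBF, and CFAR are omitted.

```python
import numpy as np

def range_doppler_map(beat_samples):
    """Two-stage FFT of FMCW beat samples, as in the processing chain above.

    beat_samples: (n_ramps, n_samples) complex array, one row per sawtooth
    ramp. The first FFT (along each ramp) resolves range bins; the second
    FFT (across ramps) resolves Doppler/velocity bins.
    """
    range_profiles = np.fft.fft(beat_samples, axis=1)   # first FFT: range
    rd = np.fft.fft(range_profiles, axis=0)             # second FFT: Doppler
    return np.abs(rd)
```

    A single simulated point target appears as one peak whose (Doppler bin, range bin) location encodes its velocity and range; CFAR detection would then threshold this map.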

    Since various computations of DBF, FFT, and MUSIC are included in signal processing chains, we designed our board using Virtex-5 FPGA (XC5VLX330; Xilinx Inc., San Jose, CA, USA) and DSP (TMS320C6455; Texas Instruments, Dallas, TX, USA) in order to make a real-time system. The preprocessing part including channel correction and DBF is implemented in FPGA, and the detection and FBSS BS-MUSIC are implemented in DSP. Fig. 3(b) shows our signal processing board.

       2. Design of the Waveform

    In the FMCW principle, targets are detected using the difference in frequency between transmitted and received signals, as shown in Fig. 4(a). This difference is due to the range and the velocity of targets. Each frequency difference by range (Δfr) and by velocity (Δfv) can be represented by

$$\Delta f_r = \frac{2\, F_{BW}\, R}{c\, T}, \qquad \Delta f_v = \frac{2\, v\, f_c}{c} \tag{1}$$

    where FBW is the frequency bandwidth, T is the duration of one ramp, c is the speed of light, R and v are the range and velocity of each target, respectively, and fc is the carrier frequency, which is 77 GHz in this case. In general, FBW is only a few hundred megahertz and T is a few milliseconds. This radar uses three or four ramps with different slopes to resolve signals from multiple targets and to calculate each range and velocity. Inherently, however, measuring sums or differences of frequencies can give the same value for several targets with different ranges and velocities, which produces false targets.

    As the slope increases, usually by decreasing T, only the frequency difference by range (Δfr) increases in Eq. (1). Fig. 4(b) shows this waveform. If Δfv is less than 1/T, the frequency difference Δfr becomes the dominant component within one ramp, and Δfv appears as a phase difference between ramps, the same as in stretch processing [18]. Therefore, Δfv can be detected over a burst of ramps through a second FFT. In this paper, this burst of ramps is called the sawtooth waveform.

    Although sawtooth waveforms might require more elaborate design work and much more computational power, they improve the detection performance for multiple targets with finer velocity resolution, which is necessary for slow-moving-target detection in a cluttered environment. Moreover, this burst of ramps improves the performance of high-resolution estimation, a main interest in this paper, by providing more snapshots. The computational load is no longer challenging, considering the advances in FPGA and DSP technology.

    The parameters used in this paper are T = 33 μs and FBW = 200 MHz, which give Δfv = 14.25 kHz and Δfr = 6.06 MHz for a target moving at 100 km/h at a range of 150 m. The number of ramps in a burst is 108, which is the maximum number of snapshots.
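    As a quick numerical check, substituting these parameters into Eq. (1) reproduces the quoted values (assuming c = 3 × 10^8 m/s):

```python
c = 3e8          # speed of light (m/s)
fc = 77e9        # carrier frequency (Hz)
FBW = 200e6      # sweep bandwidth (Hz)
T = 33e-6        # ramp duration (s)
R = 150.0        # target range (m)
v = 100 / 3.6    # 100 km/h in m/s

delta_fr = 2 * FBW * R / (c * T)   # range-induced beat frequency, Eq. (1)
delta_fv = 2 * v * fc / c          # Doppler shift, Eq. (1)

print(round(delta_fr / 1e6, 2))    # 6.06 (MHz)
print(round(delta_fv / 1e3, 2))    # 14.26 (kHz; ~14.25 kHz as quoted)
```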

       3. High-Resolution Estimation Algorithm: FBSS BS-MUSIC

    Consider a uniform linear array consisting of M identical sensors and K received signals that arrive at the linear array from directions θ1, θ2, …, θK. The signal model is given by

$$\mathbf{x}(t) = A\,\mathbf{s}(t) + \mathbf{n}(t) \tag{2}$$

    where s(t) is the K × 1 signal vector, x(t) is the M × 1 array output vector, and n(t) is the M × 1 noise vector generated through the antenna and transmitter/receiver [12]. A represents an M × K direction matrix (M > K) with rank K and is given by

$$A = \left[\,\mathbf{a}(\theta_1),\ \mathbf{a}(\theta_2),\ \ldots,\ \mathbf{a}(\theta_K)\,\right] \tag{3}$$

    with a(θk) the direction vector associated with the arrival angle θk, i.e.,

$$\mathbf{a}(\theta_k) = \left[\,1,\ e^{-j2\pi (d/\lambda)\sin\theta_k},\ \ldots,\ e^{-j2\pi (M-1)(d/\lambda)\sin\theta_k}\,\right]^T \tag{4}$$

    where d is the element spacing and λ the wavelength.

    If the beamspace output for an M × L beamforming matrix W is

$$\mathbf{y}(t) = W^H\,\mathbf{x}(t), \tag{5}$$

    then the covariance matrix is

$$R = E\left[\mathbf{y}(t)\,\mathbf{y}(t)^H\right] = U_s \Lambda_s U_s^H + U_n \Lambda_n U_n^H \tag{6}$$

    where Us is the signal subspace consisting of the K dominant eigenvectors and Un is the noise subspace consisting of the remaining (M − K) eigenvectors. The BS-MUSIC algorithm forms the angular spectrum

$$P_{BS}(\theta) = \frac{1}{\left(W^H \mathbf{a}(\theta)\right)^H U_n U_n^H \left(W^H \mathbf{a}(\theta)\right)} \tag{7}$$

    in which each peak corresponds to a target direction of arrival (DOA).
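    A minimal NumPy sketch of Eqs. (2)-(7) is given below. The DFT beamforming matrix, half-wavelength element spacing, and all simulation values are illustrative assumptions, not the system's actual design.

```python
import numpy as np

def steering(theta_deg, M, d=0.5):
    """ULA direction vector a(theta) of Eq. (4), spacing d in wavelengths."""
    k = np.arange(M)
    return np.exp(-2j * np.pi * d * k * np.sin(np.radians(theta_deg)))

def bs_music_spectrum(X, W, K, scan_deg):
    """Beamspace MUSIC pseudo-spectrum of Eq. (7).

    X: M x N element-space snapshots, W: M x L beamforming matrix,
    K: assumed number of targets, scan_deg: candidate angles (degrees).
    """
    Y = W.conj().T @ X                       # beamspace snapshots, Eq. (5)
    Rcov = Y @ Y.conj().T / Y.shape[1]       # sample covariance, Eq. (6)
    _, vecs = np.linalg.eigh(Rcov)           # eigenvalues in ascending order
    Un = vecs[:, :-K]                        # noise subspace eigenvectors
    P = []
    for th in scan_deg:
        b = W.conj().T @ steering(th, W.shape[0])
        P.append(1.0 / np.real(b.conj() @ Un @ Un.conj().T @ b))  # Eq. (7)
    return np.array(P)
```

    With eight elements and two closely spaced simulated sources, the two spectrum peaks appear near the true angles once the snapshot count and SNR are sufficient, mirroring the behavior in Fig. 6.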

    We apply the FBSS method of [12] in the configuration of Fig. 5. The covariance matrix of the lth forward subarray of beams, Rlf, and that of the lth backward subarray, Rlb, are given by

$$\mathbf{y}_l^f(t) = \left[\,y_l(t),\ y_{l+1}(t),\ \ldots,\ y_{l+M_0-1}(t)\,\right]^T, \qquad R_l^f = E\left[\mathbf{y}_l^f(t)\,\mathbf{y}_l^f(t)^H\right] \tag{8}$$

$$\mathbf{y}_l^b(t) = \left[\,y_{M-l+1}^*(t),\ y_{M-l}^*(t),\ \ldots,\ y_{M-l-M_0+2}^*(t)\,\right]^T, \qquad R_l^b = E\left[\mathbf{y}_l^b(t)\,\mathbf{y}_l^b(t)^H\right] \tag{9}$$

    where yl(t) stands for the output of the lth beam, ylf(t) for the lth forward subarray output, and ylb(t) for the reversed complex conjugate of the corresponding backward subarray output, for l = 1, 2, …, L, where L denotes the total number of forward subarrays.

    Rf and Rb in Eq. (10) represent the forward and backward spatially smoothed covariance matrices, i.e., the means of the subarray covariance matrices:

$$R^f = \frac{1}{L}\sum_{l=1}^{L} R_l^f, \qquad R^b = \frac{1}{L}\sum_{l=1}^{L} R_l^b \tag{10}$$

    The FB smoothed covariance matrix R′ is the mean of Rf and Rb; i.e.,

$$R' = \frac{1}{2}\left(R^f + R^b\right) \tag{11}$$

    Replacing R with R′ in Eq. (6), the angular spectrum in Eq. (7) changes into

$$P'_{BS}(\theta) = \frac{1}{\mathbf{b}(\theta)^H\, U'_n\, {U'_n}^{H}\, \mathbf{b}(\theta)} \tag{12}$$

    where b(θ) is the subarray beamspace steering vector and U′n is the noise subspace of R′ consisting of the (L − K) eigenvectors, as in Eq. (6).

    If L, M, and the number of beams are appropriately chosen such that L = M − M0 + 1 ≥ K, the dimension of the signal subspace K is determined by the number of targets. The values chosen in our system are M = 8, M0 = 6, and K = 2.
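    The smoothing of Eqs. (8)-(11) can be sketched as follows, with sample averages standing in for expectations; the function name and the coherent-source test scenario are our own illustrative assumptions.

```python
import numpy as np

def fbss_covariance(Y, M0):
    """Forward/backward spatially smoothed covariance R' of Eq. (11).

    Y: M x N snapshot matrix (beam or element outputs), M0: subarray size.
    The number of subarrays is L = M - M0 + 1, as in the text.
    """
    M, N = Y.shape
    L = M - M0 + 1
    J = np.eye(M0)[::-1]                   # exchange (reversal) matrix
    Rf = np.zeros((M0, M0), dtype=complex)
    Rb = np.zeros((M0, M0), dtype=complex)
    for l in range(L):
        Yl = Y[l:l + M0, :]                # l-th forward subarray, Eq. (8)
        Rl = Yl @ Yl.conj().T / N          # its sample covariance
        Rf += Rl / L                       # forward smoothing, Eq. (10)
        Rb += J @ Rl.conj() @ J / L        # backward counterpart, Eqs. (9)-(10)
    return (Rf + Rb) / 2                   # Eq. (11)
```

    The point of the smoothing is rank restoration: for two fully coherent returns the plain covariance is rank one, while R′ regains a two-dimensional signal subspace, which is what lets MUSIC separate correlated targets.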

    III. RESULTS AND DISCUSSION

       1. Simulated Signals

    In order to examine the effect of the number of snapshots, i.e., of the sawtooth waveform, a set of received array signals from two targets at -14° and -18.5° was simulated with random noise. As shown in Eq. (4), the signals were simulated assuming identical, omnidirectional element patterns. Although we verified the effect with white Gaussian noise, we used colored noise in this simulation because it shows the effect clearly and is more realistic. The colored noise was generated by passing random noise through a filter; it was then scaled according to the SNR and added to the signals.

    The normalized angular spectra for different SNRs are shown in Fig. 6. As the number of snapshots increases, the two peaks from the targets sharpen so that the two targets can be more easily resolved. In addition, the estimated angles approach the true values.

    The sharpness can be represented by the depth, which is defined as the height of the second peak above the valley between the two peaks. The depth increases as the number of snapshots increases when the SNR is ≥5 dB, as shown in Fig. 7.
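    Under one plausible reading of this definition (second-peak height measured from the valley), the depth can be computed directly from a sampled spectrum; the helper name and toy input below are our own.

```python
import numpy as np

def spectrum_depth(spec):
    """Depth of the valley between the two strongest peaks of a spectrum.

    Returns the second peak's value minus the valley value between the two
    peaks (same units as spec); a larger depth means sharper resolution.
    """
    # indices of interior local maxima
    peaks = [i for i in range(1, len(spec) - 1)
             if spec[i] > spec[i - 1] and spec[i] >= spec[i + 1]]
    # the two strongest peaks, in index order
    p1, p2 = sorted(sorted(peaks, key=lambda i: spec[i])[-2:])
    valley = min(spec[p1:p2 + 1])          # deepest point between them
    return min(spec[p1], spec[p2]) - valley
```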

    In addition, the number of snapshots improves the error performance. Fig. 8 shows the angle errors with respect to the number of snapshots for different SNRs.

    As the number of snapshots increases up to 20 in this case, the estimated errors decrease and converge to a single value for each SNR. This bias error decreases as the SNR increases. Moreover, the variance of the errors is reduced continuously as the number of snapshots increases for all SNRs.

       2. Real Signals

    In the experiment to verify the method, we used two corner reflectors, as shown in Fig. 9. The reflectors were located at 2.4° and 11.5°, respectively, from the boresight of the radar. The angle difference was about 60% of the beamwidth.

    First, using a premeasured calibration matrix, the data from each channel were aligned, as shown in Fig. 10.

    The resulting depth and errors are shown in Figs. 11 and 12, respectively, with respect to the number of snapshots. The estimated errors are reduced to less than 0.5° in bias and 0.1° in variance by using more snapshots.

    IV. CONCLUSIONS

    We implemented a real-time radar system with high-resolution angle estimation for a UGV. We designed sawtooth waveforms and applied them with DBF and FBSS BS-MUSIC in a signal processing unit using an FPGA and DSPs. The experimental results showed that two targets separated by less than 60% of the beamwidth are resolved in real-time processing. We also showed that more snapshots from sawtooth waveforms improve the accuracy and robustness of the high-resolution angle estimation.

References
  • 1. M. Schneider, "Automotive radar: status and trends," in Proceedings of the German Microwave Conference (GeMiC), 2005, pp. 144-147.
  • 2. H. Rohling, M. M. Meinecke, K. Mott, and L. Urs, "Research activities in automotive radar," in Proceedings of the 4th International Kharkov Symposium on Physics and Engineering of Millimeter and Sub-Millimeter Waves, 2001, pp. 48-51.
  • 3. R. O. Schmidt, "Multiple emitter location and signal parameter estimation," IEEE Transactions on Antennas and Propagation, vol. 34, pp. 276-280, 1986.
  • 4. P. Wenig, M. Schoor, O. Gunther, B. Yang, and R. Weigel, "System design of a 77 GHz automotive radar sensor with superresolution DOA estimation," in Proceedings of the IEEE International Symposium on Signals, Systems and Electronics (ISSSE), 2007, pp. 537-540.
  • 5. J. Choi, J. Park, and D. Yeom, "High angular resolution estimation methods for vehicle FMCW radar," in Proceedings of the IEEE CIE International Conference on Radar, 2011, pp. 1868-1871.
  • 6. S. H. Park, J. H. Lee, and K. T. Kim, "Performance analysis of the scenario-based construction method for real target ISAR recognition," Progress in Electromagnetics Research, vol. 128, pp. 137-151, 2012.
  • 7. J. M. Kim, O. K. Lee, and J. C. Ye, "Compressive MUSIC: revisiting the link between compressive sensing and array signal processing," IEEE Transactions on Information Theory, vol. 58, pp. 278-301, 2012.
  • 8. M. Ascione, A. Buonanno, M. D'Urso, L. Angrisani, and R. Schiano Lo Moriello, "A new measurement method based on music algorithm for through-the-wall detection of life signs," IEEE Transactions on Instrumentation and Measurement, vol. 62, pp. 13-26, 2013.
  • 9. Y. D. Joh and W. K. Park, "Structural behavior of the MUSIC-type algorithm for imaging perfectly conducting cracks," Progress in Electromagnetics Research, vol. 138, pp. 211-226, 2013.
  • 10. M. Schoor and B. Yang, "High-resolution angle estimation for an automotive FMCW radar sensor," in Proceedings of the International Radar Symposium (IRS), 2007, pp. 1-5.
  • 11. P. Stoica and A. Nehorai, "Comparative performance study of element-space and beam-space MUSIC estimators," Circuits, Systems, and Signal Processing, vol. 10, pp. 285-292, 1991.
  • 12. S. U. Pillai and B. H. Kwon, "Forward/backward spatial smoothing techniques for coherent signal identification," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 37, pp. 8-15, 1989.
  • 13. H. B. Lee and M. Wengrovitz, "Resolution threshold of beamspace MUSIC for two closely spaced emitters," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 38, pp. 1545-1559, 1990.
  • 14. Y. Yang, C. Wan, C. Sun, and Q. Wang, "DOA estimation for coherent sources in beamspace using spatial smoothing," in Proceedings of the 2003 Joint Conference of the 4th International Conference on Information, Communications and Signal Processing and the 4th Pacific Rim Conference on Multimedia, 2003, pp. 1028-1032.
  • 15. X. L. Xu and K. M. Buckley, "Statistical performance comparison of MUSIC in element-space and beam-space," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1989, pp. 2124-2127.
  • 16. X. L. Xu and K. Buckley, "A comparison of element and beam space spatial-spectrum estimation for multiple source clusters," in Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1990, pp. 2643-2646.
  • 17. N. Phaisal-Atsawasenee and R. Suleesathira, "Improved angular resolution of beamspace MUSIC for finding directions of coherent sources," in Proceedings of the 1st International Symposium on Systems and Control in Aerospace and Astronautics (ISSCAA), 2006, pp. 51-56.
  • 18. M. I. Skolnik, Radar Handbook, 3rd ed. New York: McGraw-Hill, 2008.
Images / Tables
  • [ Fig. 1. ]  Operational concept of the unmanned ground vehicle.
  • [ Fig. 2. ]  System description.
  • [ Table 1. ]  Specifications of the transmitter/receiver.
  • [ Fig. 3. ]  Configuration of the signal processing unit. (a) Block diagram of the signal processing unit. (b) Processing board.
  • [ Fig. 4. ]  Shapes of waveforms. (a) Two up/down ramps. (b) Sawtooth with steep slope.
  • [ Fig. 5. ]  The forward/backward spatial smoothing scheme.
  • [ Fig. 6. ]  Normalized angular spectrum with a different number of snapshots. (a) SNR = 0 dB, (b) SNR = 5 dB, and (c) SNR = 10 dB.
  • [ Fig. 7. ]  Sharpness as the number of snapshots increases.
  • [ Fig. 8. ]  Angle estimation performance: estimated errors and variance of the errors.
  • [ Fig. 9. ]  Photograph of the radar and the corner reflectors.
  • [ Fig. 10. ]  Data before and after channel alignment. (a) Raw data of arrays. (b) Aligned data.
  • [ Fig. 11. ]  Normalized angular spectrum and depth with respect to the number of snapshots.
  • [ Fig. 12. ]  Angular error according to the number of snapshots.