PARALLEL IMPLEMENTATION OF THE EXPECTATION-MAXIMISATION ALGORITHM FOR THE TRAINING OF GAUSSIAN MIXTURE MODELS
Journal: thescipub.com
Year of publication: 2014
File format: PDF
Article code: 24213
Abstract:
Most machine learning algorithms need to handle large data sets, a requirement that often runs into limits on processing time and memory. Expectation-Maximization (EM) is one such algorithm; it is used to train one of the most commonly used parametric statistical models, the Gaussian Mixture Model (GMM). All steps of the algorithm are potentially parallelizable, since each one iterates over the entire data set. In this study, we propose a parallel implementation of EM for training GMMs using CUDA. Experiments performed with a UCI dataset show a speedup of 7 compared to the sequential version. We have also modified the code to improve global memory access and shared memory usage, achieving an occupancy of up to 56.4% regardless of the number of Gaussians considered in the experiments.
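As a point of reference for what the paper parallelises, the sequential EM loop for a diagonal-covariance GMM can be sketched in NumPy. This is not the paper's CUDA code; it is a minimal illustrative sketch (all names are ours) showing why both steps are parallelizable: the E-step computes a responsibility per data point independently, and the M-step reduces weighted sums over all points.

```python
import numpy as np

def em_gmm(X, k, n_iter=50, seed=0):
    """Sequential EM for a Gaussian mixture with diagonal covariances.

    Illustrative sketch of the algorithm the paper parallelises: the
    E-step and the M-step each iterate over the entire data set, so
    each data point (E-step) and each weighted reduction (M-step)
    maps naturally onto CUDA threads.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]    # initial means: random points
    var = np.ones((k, d)) * X.var(axis=0)      # initial per-dimension variances
    w = np.full(k, 1.0 / k)                    # initial mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(axis=2)
                 + np.log(w))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from weighted sums over all points.
        nk = r.sum(axis=0)                     # effective count per component
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ (X ** 2)) / nk[:, None] - mu ** 2 + 1e-6
    return w, mu, var
```

In a CUDA version of this loop, the per-point E-step is the natural kernel (one thread per data point), while the M-step sums become parallel reductions; the paper's memory-access and shared-memory optimisations target exactly those reductions.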
Keywords: Expectation-Maximization (EM), Gaussian Mixture Models (GMM), CUDA