
(J-KSIAM) Volume 22 Number 1 (March 2018 issue) TOC

Date: 18-03-21 14:31
Posted by: Kim, Junseok

Dear colleagues and researchers,

The Journal of the Korean Society for Industrial and Applied Mathematics (J-KSIAM) Volume 22 Number 1
(March 2018 issue) has been posted at http://www.ksiam.org/archive/. The aims and scope and other
information about the journal are available on the KSIAM website, http://www.ksiam.org or
http://www.ksiam.org/jksiam. The journal has been indexed in the Korea Citation Index (KCI) since 2007.
Readers interested in the following articles may download each article free of charge from our website,
and authors are encouraged to submit papers via the online submission site http://www.ksiam.org/jksiam/

Sincerely yours,

Minkyu Kwak, Editor-in-Chief
Zhiming Chen, June-Yub Lee, Tao Tang, Associate Editors-in-Chief
Jin Yeon Cho, Junseok Kim, Managing Editors


JKSIAM-v22n1 pp001-013
An Optimal Boosting Algorithm based on Nonlinear Conjugate Gradient method
Jooyeon Choi, Bora Jeong, Yesom Park, Jiwon Seo, Chohong Min

Boosting, one of the most successful algorithms for supervised learning, searches for the most accurate weighted
sum of weak classifiers. The search corresponds to a convex program with non-negativity and affine constraints.
In this article, we propose a novel conjugate gradient algorithm with the modified Polak-Ribiere-Polyak conjugate
direction. We prove the convergence of the algorithm and report its successful application to boosting.


JKSIAM-v22n1 pp015-028
Acceleration of Machine Learning Algorithms by Tchebychev Iteration Technique
Mikhail P. Levin

Machine learning algorithms are now widely used to process big data in various applications, and many of these
applications run in real time, so the speed of the algorithms is a critical issue. However, most modern iterative
machine learning algorithms rely on the successive iteration technique well known in numerical linear algebra.
This technique converges slowly, requiring many iterations to solve the problems under consideration and
therefore a lot of processing time even on modern multi-core computers and clusters. The Tchebychev iteration
technique, also well known in numerical linear algebra, is an attractive candidate for reducing both the number
of iterations and the running time of these algorithms, which is especially important in real-time applications.
In this paper we consider the use of Tchebychev iterations to accelerate the well-known K-Means and SVM
(Support Vector Machine) algorithms in machine learning. Examples of our approach on modern multi-core computers
under the Apache Spark framework are presented and discussed.
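The acceleration idea the abstract refers to can be sketched on a linear fixed-point iteration x = Gx + f whose iteration matrix has real eigenvalues in [-rho, rho] with rho < 1. The 2x2 system, rho, and iteration count below are illustrative assumptions, not taken from the paper:

```python
# Sketch: Chebyshev (Tchebychev) semi-iteration for x = G x + f, using the
# Chebyshev recurrence C_{k+1}(t) = (2/rho) C_k(t) - C_{k-1}(t) at t = 1/rho.

def matvec(G, x):
    return [sum(gij * xj for gij, xj in zip(row, x)) for row in G]

def chebyshev_iterate(G, f, x0, rho, iters=30):
    x_prev = x0
    x = [gxi + fi for gxi, fi in zip(matvec(G, x0), f)]   # plain first step
    c_prev, c = 1.0, 1.0 / rho                            # C_0, C_1 at 1/rho
    for _ in range(iters - 1):
        c_next = (2.0 / rho) * c - c_prev
        gx = [gxi + fi for gxi, fi in zip(matvec(G, x), f)]
        # x_{k+1} = (2 c_k / (rho c_{k+1})) (G x_k + f) - (c_{k-1}/c_{k+1}) x_{k-1}
        x_next = [(2.0 * c / (rho * c_next)) * gxi - (c_prev / c_next) * xpi
                  for gxi, xpi in zip(gx, x_prev)]
        x_prev, x = x, x_next
        c_prev, c = c, c_next
    return x

G = [[0.5, 0.2], [0.1, 0.4]]   # spectral radius 0.6
f = [1.0, 1.0]
x = chebyshev_iterate(G, f, [0.0, 0.0], rho=0.6)
# exact fixed point is (I - G)^{-1} f
```

Compared with the plain successive iteration x_{k+1} = G x_k + f, whose error shrinks by a factor of about rho per step, the Chebyshev recurrence shrinks it asymptotically by rho / (1 + sqrt(1 - rho^2)) per step (1/3 instead of 0.6 for this toy system), which is the source of the speed-up.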


JKSIAM-v22n1 pp029-062
Stability of delay-distributed HIV infection models with multiple viral producer cells
A. M. Elaiw, E. Kh. Elnahary, A. M. Shehata, M. Abul-Ez

We investigate a class of HIV infection models with two kinds of target cells: CD4^+T cells and macrophages. We
incorporate three distributed time delays into the models. Moreover, we consider the effect of humoral immunity on
the dynamical behavior of HIV. The viruses are produced from four types of infected cells: short-lived infected
CD4^+T cells, long-lived chronically infected CD4^+T cells, short-lived infected macrophages and long-lived chronically
infected macrophages. The drug efficacy is assumed to be different for the two types of target cells. The HIV-target
incidence rate is given by bilinear and saturation functional responses, while for the third model both the HIV-target
incidence rate and the neutralization rate of viruses are given by nonlinear general functions. We show that the
solutions of the proposed models are nonnegative and ultimately bounded. We derive two threshold parameters which
fully determine the positivity and stability of the three steady states of the models. Using Lyapunov functionals,
we establish the global stability of the steady states of the models. The theoretical results are confirmed by
numerical simulations.


JKSIAM-v22n1 pp063-074
Effect of perturbation in the solution of fractional neutral functional differential equations
Mohammed S. Abdo, Satish K. Panchal

In this paper, we study the initial value problem for neutral functional differential equations involving the Caputo
fractional derivative of order alpha in (0,1) with infinite delay. Some sufficient conditions for the uniqueness and
continuous dependence of solutions are established by virtue of fractional calculus and the Banach fixed point theorem.
The results show that the solution depends closely on the delay conditions and on small perturbations of the problem.
An example is provided to illustrate the main results.


JKSIAM-v22n1 pp075-089
Comparative study of numerical algorithms for the arithmetic Asian option
Jian Wang, Jungyup Ban, Seongjin Lee, Changwoo Yoo

This paper presents the numerical valuation of the arithmetic Asian option using the operator-splitting method (OSM).
Since there is no closed-form solution for the arithmetic Asian option, finding a good numerical algorithm for valuing
it is important. In this paper, we focus on a two-dimensional PDE. The OSM is well known for handling multi-dimensional
PDEs using finite difference discretization. We provide a detailed numerical algorithm and compare the results with
those of the Monte Carlo simulation (MCS) method to show the performance of the method.
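As a rough illustration of the MCS side of such a comparison, an arithmetic-average Asian call under geometric Brownian motion can be priced by simulation as follows; all market parameters here (S0, K, r, sigma, T) and the discretization settings are illustrative assumptions, not the paper's test cases:

```python
# Sketch: Monte Carlo pricing of a discretely monitored arithmetic-average
# Asian call under Black-Scholes dynamics dS = r S dt + sigma S dW.
import math
import random

def asian_call_mc(S0, K, r, sigma, T, n_steps=50, n_paths=20000, seed=1):
    random.seed(seed)
    dt = T / n_steps
    drift = (r - 0.5 * sigma ** 2) * dt      # exact GBM log-step
    vol = sigma * math.sqrt(dt)
    payoff_sum = 0.0
    for _ in range(n_paths):
        S, total = S0, 0.0
        for _ in range(n_steps):
            S *= math.exp(drift + vol * random.gauss(0.0, 1.0))
            total += S                       # running sum for the average
        payoff_sum += max(total / n_steps - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths

price = asian_call_mc(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

The estimate carries a statistical error of order O(1/sqrt(n_paths)); a PDE method such as the OSM instead solves the two-dimensional pricing equation on a grid and has purely deterministic discretization error, which is what makes the comparison informative.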