<![CDATA[MoneyScience: Research]]>
http://www.moneyscience.com/pg/blog-directory/research?view=rss
FinancialResearchFocus
https://feedburner.google.com
http://www.moneyscience.com/pg/blog/arXiv/read/834295/unfolding-the-complexity-of-the-global-value-chain-strengths-and-entropy-in-the-singlelayer-multiplex-and-multilayer-international-trade-networks-arxiv180907407v1-physicssocph
Thu, 20 Sep 2018 19:36:58 -0500
<![CDATA[Unfolding the complexity of the global value chain: Strengths and entropy in the single-layer, multiplex, and multi-layer international trade networks. (arXiv:1809.07407v1 [physics.soc-ph])]]>
The worldwide trade network has been widely studied through different data
sets and network representations with a view to better understanding
interactions among countries and products. Here, we investigate international
trade through the lens of single-layer, multiplex, and multi-layer
networks. We discuss differences among the three network frameworks in terms of
their relative advantages in capturing salient topological features of trade.
We draw on the World Input-Output Database to build the three networks. We then
uncover sources of heterogeneity in the way strength is allocated among
countries and transactions by computing the strength distribution and entropy
in each network. Additionally, we trace how entropy evolved, and show how the
observed peaks can be associated with the onset of the global economic
downturn. Findings suggest how more complex representations of trade, such as
the multi-layer network, enable us to disambiguate the distinct roles of intra-
and cross-industry transactions in driving the evolution of entropy at a more
aggregate level. We discuss our results and the implications of our comparative
analysis of networks for research on international trade and other empirical
domains across the natural and social sciences.
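As a minimal illustration of the quantities the abstract refers to, the sketch below computes node strengths and the Shannon entropy of the strength distribution for a toy weighted trade network. The weights are made up for illustration and are not taken from the World Input-Output Database.

```python
import numpy as np

# Toy weighted trade network: W[i, j] is the trade flow from country i
# to country j (hypothetical numbers, for illustration only).
W = np.array([
    [0.0, 4.0, 1.0],
    [2.0, 0.0, 3.0],
    [1.0, 1.0, 0.0],
])

# Out-strength of a node: total weight of its outgoing links.
out_strength = W.sum(axis=1)

# Shannon entropy (natural log) of the normalised strength distribution
# measures how evenly trade volume is allocated across countries.
p = out_strength / out_strength.sum()
entropy = -np.sum(p * np.log(p))

print(out_strength)   # [5. 5. 2.]
print(round(entropy, 4))
```

A uniform allocation of strength would give the maximal entropy log(3); concentration of trade on few countries pushes the entropy below that bound.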
]]>
834295
http://www.moneyscience.com/pg/blog/arXiv/read/834295/unfolding-the-complexity-of-the-global-value-chain-strengths-and-entropy-in-the-singlelayer-multiplex-and-multilayer-international-trade-networks-arxiv180907407v1-physicssocph
http://www.moneyscience.com/pg/blog/arXiv/read/834294/on-the-quasisure-superhedging-duality-with-frictions-arxiv180907516v1-qfinmf
Thu, 20 Sep 2018 19:36:56 -0500
<![CDATA[On the quasi-sure superhedging duality with frictions. (arXiv:1809.07516v1 [q-fin.MF])]]>
We prove the superhedging duality for a discrete-time financial market with
proportional transaction costs under portfolio constraints and model
uncertainty. Frictions are modeled through solvency cones as in the original
model of [Kabanov, Y., Hedging and liquidation under transaction costs in
currency markets. Fin. Stoch., 3(2):237-248, 1999] adapted to the quasi-sure
setup of [Bouchard, B. and Nutz, M., Arbitrage and duality in nondominated
discrete-time models. Ann. Appl. Probab., 25(2):823-859, 2015]. Our results
hold under the condition of No Strict Arbitrage and under the efficient
friction hypothesis.
]]>
834294
http://www.moneyscience.com/pg/blog/arXiv/read/834294/on-the-quasisure-superhedging-duality-with-frictions-arxiv180907516v1-qfinmf
http://www.moneyscience.com/pg/blog/arXiv/read/834293/insider-trading-with-penalties-arxiv180907545v1-qfintr
Thu, 20 Sep 2018 19:36:56 -0500
<![CDATA[Insider Trading with Penalties. (arXiv:1809.07545v1 [q-fin.TR])]]>
We consider a one-period Kyle (1985) framework where the insider can be
subject to a penalty if she trades. We establish existence and uniqueness of
equilibrium for virtually any penalty function when noise is uniform. In
equilibrium, the demand of the insider and the price functions are in general
non-linear, yet remain analytically tractable because the expected price
function is linear. We use this result to investigate the trade-off between
price efficiency and 'fairness': we consider a regulator that wants to minimise
the post-trade standard deviation for a given level of uninformed traders'
losses. The minimisation is over the function space of penalties; for each
possible penalty, our existence and uniqueness theorem allows us to define
unambiguously the post-trade standard deviation and the uninformed traders'
losses that prevail in equilibrium. Optimal penalties are characterized in
closed form. They must increase quickly with the magnitude of the insider's
order for small orders and become flat for large orders: when the fundamental
takes very high or very low values, the insider finds it optimal to trade
despite the high penalty. Although such trades, if they occur, are costly for
liquidity traders, they signal extreme events and therefore incorporate a lot
of information into prices. We generalize this result in two directions by
imposing a budget constraint on the regulator and considering the cases of
either non-pecuniary or pecuniary penalties. In the first case, we establish
that optimal penalties are a subset of the previously optimal penalties: the
patterns of equilibrium trade volumes and prices are unchanged. In the second
case, we also fully characterize the constrained efficient points and penalties
and show that new patterns emerge in the demand schedules of the insider
and the associated price functions.
]]>
834293
http://www.moneyscience.com/pg/blog/arXiv/read/834293/insider-trading-with-penalties-arxiv180907545v1-qfintr
http://www.moneyscience.com/pg/blog/arXiv/read/834292/geometric-local-variance-gamma-model-arxiv180907727v1-qfinpr
Thu, 20 Sep 2018 19:36:53 -0500
<![CDATA[Geometric Local Variance Gamma model. (arXiv:1809.07727v1 [q-fin.PR])]]>
This paper describes another extension of the Local Variance Gamma model
originally proposed by P. Carr in 2008, and then further elaborated on by Carr
and Nadtochiy, 2017 (CN2017), and Carr and Itkin, 2018 (CI2018). As compared
with the latest version of the model developed in CI2018 and called the ELVG
(the Expanded Local Variance Gamma model), here we provide two innovations.
First, in all previous papers the model was constructed based on a Gamma
time-changed {\it arithmetic} Brownian motion: with no drift in CN2017 and
with drift in CI2018, and with the local variance a function of the spot level
only. In contrast, here we develop a {\it geometric} version of this model with
drift. Second, in CN2017 the model was calibrated to option smiles assuming the
local variance is a piecewise constant function of strike, while in CI2018 the
local variance is a piecewise linear function of strike. In this paper we
consider three piecewise linear models: the local variance as a function of
strike, the local variance as a function of log-strike, and the local
volatility as a function of strike (so the local variance is a piecewise
quadratic function of strike). We show that for all these new constructions it
is still possible to derive an ordinary differential equation for the option
price, which plays the role of Dupire's equation for the standard local
volatility model and, moreover, can be solved in closed form. Finally, similar
to CI2018, we show that given multiple smiles the whole local
variance/volatility surface can be recovered without solving any optimization
problem. Instead, it can be done term-by-term by solving a system of non-linear
algebraic equations for each maturity, which is fast.
]]>
834292
http://www.moneyscience.com/pg/blog/arXiv/read/834292/geometric-local-variance-gamma-model-arxiv180907727v1-qfinpr
http://www.moneyscience.com/pg/blog/arXiv/read/834207/a-revisit-of-the-borch-rule-for-the-principalagent-risksharing-problem-arxiv180907040v1-qfinrm
Wed, 19 Sep 2018 19:37:01 -0500
<![CDATA[A revisit of the Borch rule for the Principal-Agent Risk-Sharing problem. (arXiv:1809.07040v1 [q-fin.RM])]]>
In this paper we provide a new approach to tackle the Principal-Agent
Risk-Sharing problem using optimal stochastic control techniques. Our analysis
relies on an optimal decomposition of the expected utility of the Principal in
terms of the reservation utility of the Agent. In particular, this allows us to
derive the Borch rule as a necessary optimality condition for this
decomposition to hold, which sheds new light on this economic concept. As a
by-product, this approach provides a class of risk-sharing plans that satisfy
the Borch rule, a class to which the optimal plan belongs.
]]>
834207
http://www.moneyscience.com/pg/blog/arXiv/read/834207/a-revisit-of-the-borch-rule-for-the-principalagent-risksharing-problem-arxiv180907040v1-qfinrm
http://www.moneyscience.com/pg/blog/arXiv/read/834206/complex-market-dynamics-in-the-light-of-random-matrix-theory-arxiv180907100v1-qfincp
Wed, 19 Sep 2018 19:37:01 -0500
<![CDATA[Complex market dynamics in the light of random matrix theory. (arXiv:1809.07100v1 [q-fin.CP])]]>
We present a brief overview of random matrix theory (RMT) with the objective
of highlighting its computational results and applications in financial markets
viewed as complex systems. An oft-encountered problem in computational finance
is the choice of an appropriate epoch over which the empirical
cross-correlation return matrix is computed. A long epoch smooths out the
fluctuations in the return time series but suffers from non-stationarity,
whereas a short epoch results in noisy fluctuations in the return time series
and correlation matrices that turn out to be highly singular. An effective way
to tackle this issue is the power mapping, in which a non-linear distortion is
applied to a short-epoch correlation matrix. The value of the distortion
parameter controls the noise suppression. The distortion also removes the degeneracy of zero
eigenvalues. Depending on the correlation structures, interesting properties of
the eigenvalue spectra are found. We simulate different correlated Wishart
matrices to compare the results with empirical return matrices computed using
the S&P 500 (USA) market data for the period 1985-2016. We also briefly review
two recent applications of RMT in financial stock markets: (i) Identification
of "market states" and long-term precursor to a critical state; (ii)
Characterization of catastrophic instabilities (market crashes).
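The power mapping described above can be sketched in a few lines: each off-diagonal correlation c is replaced by sign(c) * |c|**q, which suppresses small, noise-dominated entries more than large ones. The exponent q = 1.5 and the epoch sizes below are illustrative choices, not values taken from the paper.

```python
import numpy as np

def power_map(C, q):
    """Element-wise power mapping of a correlation matrix.

    Off-diagonal entries c are mapped to sign(c) * |c|**q; the diagonal
    is reset to 1 so the result is still a valid correlation-like matrix.
    """
    Cq = np.sign(C) * np.abs(C) ** q
    np.fill_diagonal(Cq, 1.0)
    return Cq

rng = np.random.default_rng(0)
# Short epoch: T barely larger than N, so the empirical correlation
# matrix is noisy (illustrative uncorrelated data, not market returns).
N, T = 50, 60
returns = rng.standard_normal((T, N))
C = np.corrcoef(returns, rowvar=False)

Cq = power_map(C, 1.5)
print(np.linalg.eigvalsh(C).min(), np.linalg.eigvalsh(Cq).min())
```

Because |c| <= 1, every off-diagonal entry shrinks in magnitude under q > 1, which is the noise-suppression effect the abstract refers to.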
]]>
834206
http://www.moneyscience.com/pg/blog/arXiv/read/834206/complex-market-dynamics-in-the-light-of-random-matrix-theory-arxiv180907100v1-qfincp
http://www.moneyscience.com/pg/blog/arXiv/read/834205/enabling-scientific-crowds-the-theory-of-enablers-for-crowdbased-scientific-investigation-arxiv180907195v1-qfincp
Wed, 19 Sep 2018 19:37:01 -0500
<![CDATA[Enabling Scientific Crowds: The Theory of Enablers for Crowd-Based Scientific Investigation. (arXiv:1809.07195v1 [q-fin.CP])]]>
Evidence shows that in a significant number of cases the current methods of
research do not allow for reproducible and falsifiable procedures of scientific
investigation. As a consequence, the majority of critical decisions at all
levels, from personal investment choices to overreaching global policies, rely
on some variation of trial and error and are mostly non-scientific by definition.
We lack transparency for procedures and evidence, proper explanation of market
events, predictability on effects, or identification of causes. There is no
clear demarcation of what is inherently scientific, and as a consequence, the
line between fake and genuine is blurred. This paper presents highlights of the
Theory of Enablers for Crowd-Based Scientific Investigation, or Theory of
Enablers for short. The Theory of Enablers assumes the use of a next-generation
investigative approach leveraging forces of human diversity, micro-specialized
crowds, and proper computer-assisted control methods associated with
accessibility, reproducibility, communication, and collaboration. This paper
defines the set of very specific cognitive and non-cognitive enablers for
crowd-based scientific investigation: methods of proof, large-scale
collaboration, and a domain-specific computational representation. These
enablers allow the application of procedures of structured scientific
investigation powered by crowds, a collective brain in which neurons are human
collaborators.
]]>
834205
http://www.moneyscience.com/pg/blog/arXiv/read/834205/enabling-scientific-crowds-the-theory-of-enablers-for-crowdbased-scientific-investigation-arxiv180907195v1-qfincp
http://www.moneyscience.com/pg/blog/arXiv/read/834204/parameter-estimation-of-heavytailed-ar-model-with-missing-data-via-stochastic-em-arxiv180907203v1-statap
Wed, 19 Sep 2018 19:37:01 -0500
<![CDATA[Parameter Estimation of Heavy-Tailed AR Model with Missing Data via Stochastic EM. (arXiv:1809.07203v1 [stat.AP])]]>
The autoregressive (AR) model is a widely used model for understanding time
series data. Traditionally, the innovation noise of the AR model is taken to be
Gaussian. However, many time series applications, for example financial time
series, are non-Gaussian; therefore, an AR model with more general
heavy-tailed innovations is preferred. Another issue that frequently occurs in
time series is missing values, due to system recording failures or
unexpected data loss. Although there are numerous works on Gaussian AR time
series with missing values, as far as we know, no existing work addresses
the issue of missing data for the heavy-tailed AR model. In this
paper, we consider this issue for the first time, and propose an efficient
framework for the parameter estimation from incomplete heavy-tailed time series
based on the stochastic approximation expectation maximization (SAEM) coupled
with a Markov Chain Monte Carlo (MCMC) procedure. The proposed algorithm is
computationally cheap and easy to implement. The convergence of the proposed
algorithm to a stationary point of the observed data likelihood is rigorously
proved. Extensive simulations on synthetic and real datasets demonstrate the
efficacy of the proposed framework.
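For concreteness, the sketch below simulates the simplest instance of the model class above, an AR(1) with Student-t innovations, and randomly deletes observations to mimic the incomplete-data setting. It is only a data-generating sketch under illustrative parameter choices; it does not implement the paper's SAEM/MCMC estimator.

```python
import numpy as np

# Heavy-tailed AR(1): y[t] = phi * y[t-1] + e[t], e[t] ~ Student-t(nu).
# phi, nu, and the 10% missing rate are illustrative assumptions.
rng = np.random.default_rng(42)
phi, nu, T = 0.6, 3.0, 1000

y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + rng.standard_t(nu)

# Knock out roughly 10% of observations, as in the missing-data setting.
missing = rng.random(T) < 0.1
y_obs = np.where(missing, np.nan, y)

print(int(np.isnan(y_obs).sum()), "missing of", T)
```

An estimator such as the paper's stochastic EM would then be run on `y_obs`, treating the NaN entries as latent variables.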
]]>
834204
http://www.moneyscience.com/pg/blog/arXiv/read/834204/parameter-estimation-of-heavytailed-ar-model-with-missing-data-via-stochastic-em-arxiv180907203v1-statap
http://www.moneyscience.com/pg/blog/arXiv/read/834203/pricing-american-options-by-exercise-rate-optimization-arxiv180907300v1-qfincp
Wed, 19 Sep 2018 19:37:01 -0500
<![CDATA[Pricing American Options by Exercise Rate Optimization. (arXiv:1809.07300v1 [q-fin.CP])]]>
We present a novel method for the numerical pricing of American options based
on Monte Carlo simulation and optimization of exercise strategies. Previous
solutions to this problem either explicitly or implicitly determine so-called
optimal \emph{exercise regions}, which consist of points in time and space at
which the option is exercised. In contrast, our method determines
\emph{exercise rates} of randomized exercise strategies. We show that the
supremum of the corresponding stochastic optimization problem provides the
correct option price. By integrating analytically over the random exercise
decision, we obtain an objective function that is differentiable with respect
to perturbations of the exercise rate even for finitely many sample paths.
Starting from a neutral strategy with a constant exercise rate then allows us
to optimize this function globally in a gradual manner. Numerical experiments on
vanilla put options in the multivariate Black--Scholes model and preliminary
theoretical analysis underline the efficiency of our method both with respect
to the number of time-discretization steps and the required number of degrees
of freedom in the parametrization of exercise rates. Finally, the flexibility
of our method is demonstrated by numerical experiments on max call options in
the Black--Scholes model and vanilla put options in the Heston model and the
non-Markovian rough Bergomi model.
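The idea of a randomized exercise strategy can be sketched as follows. With a fixed constant exercise rate (the paper optimizes this rate over time and state; holding it constant is a simplification made here), integrating analytically over the random exercise decision turns each simulated path into a weighted sum of discounted payoffs, and the expectation gives a lower bound on the American price. This is a toy illustration, not the authors' optimized estimator; all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # Black-Scholes put
n_steps, n_paths, rate = 50, 20000, 2.0             # 'rate' is the exercise rate
dt = T / n_steps

S = np.full(n_paths, S0)
alive = np.ones(n_paths)      # probability the option is still unexercised
value = np.zeros(n_paths)
for k in range(1, n_steps + 1):
    z = rng.standard_normal(n_paths)
    S *= np.exp((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    # Exercise with probability rate*dt per step; force exercise at expiry.
    p_ex = rate * dt if k < n_steps else 1.0
    payoff = np.maximum(K - S, 0.0)
    value += alive * p_ex * np.exp(-r * k * dt) * payoff
    alive *= 1.0 - p_ex

price_lb = value.mean()       # lower bound on the American put price
print(round(price_lb, 3))
```

Because the exercise decision is integrated out analytically (via `alive`), the objective is smooth in the rate even for finitely many paths, which is the differentiability property the abstract highlights.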
]]>
834203
http://www.moneyscience.com/pg/blog/arXiv/read/834203/pricing-american-options-by-exercise-rate-optimization-arxiv180907300v1-qfincp
http://www.moneyscience.com/pg/blog/arXiv/read/834202/transport-plans-with-domain-constraints-arxiv180404283v2-mathpr-updated
Wed, 19 Sep 2018 19:36:58 -0500
<![CDATA[Transport plans with domain constraints. (arXiv:1804.04283v2 [math.PR] UPDATED)]]>
Let $\Omega$ be one of $\mathcal{X}^{N+1},C[0,1],D[0,1]$: a product of Polish
spaces, the space of continuous functions from $[0,1]$ to $\mathbb{R}^d$, and
the space of RCLL (right-continuous with left limits) functions from $[0,1]$ to
$\mathbb{R}^d$, respectively. We first consider the existence of a probability
measure $P$ on $\Omega$ such that $P$ has the given marginals $\alpha$ and
$\beta$ and its disintegration $P_x$ lies in some fixed $\Gamma(x) \subset
\mathcal{P}(\Omega)$, where $\mathcal{P}(\Omega)$ is the set of probability
measures on $\Omega$. The main application we have in mind is the martingale
optimal transport problem when the martingales are assumed to have bounded
volatility/quadratic variation. We show that such a probability measure exists
if and only if the $\alpha$ average of the so-called $G$-expectation of bounded
continuous functions with respect to the measures in $\Gamma$ is less than
their $\beta$ average.
]]>
834202
http://www.moneyscience.com/pg/blog/arXiv/read/834202/transport-plans-with-domain-constraints-arxiv180404283v2-mathpr-updated
http://www.moneyscience.com/pg/blog/arXiv/read/834201/weak-correlations-of-stocks-future-returns-arxiv180605160v2-qfinst-updated
Wed, 19 Sep 2018 19:36:56 -0500
<![CDATA[Weak Correlations of Stocks Future Returns. (arXiv:1806.05160v2 [q-fin.ST] UPDATED)]]>
We analyze correlations among stock returns via a series of widely adopted
parameters which we refer to as explanatory variables. We subsequently exploit
the results to propose a long-only quantitative adaptive technique to construct
a profitable portfolio of assets which exhibits smaller drawdowns and higher
recoveries than both an equally weighted and an efficient frontier portfolio.
]]>
834201
http://www.moneyscience.com/pg/blog/arXiv/read/834201/weak-correlations-of-stocks-future-returns-arxiv180605160v2-qfinst-updated
http://www.moneyscience.com/pg/blog/arXiv/read/834098/liberal-radicalism-formal-rules-for-a-society-neutral-among-communities-arxiv180906421v1-econgn
Tue, 18 Sep 2018 19:36:52 -0500
<![CDATA[Liberal Radicalism: Formal Rules for a Society Neutral among Communities. (arXiv:1809.06421v1 [econ.GN])]]>
We propose a design for philanthropic or publicly-funded seeding to allow
(near) optimal provision of a decentralized, self-organizing ecosystem of
public goods. The concept extends ideas from Quadratic Voting to a funding
mechanism for endogenous community formation. Individuals make public goods
contributions to projects of value to them. The amount received by the project
is (proportional to) the square of the sum of the square roots of contributions
received. Under the "standard model" this yields first best public goods
provision. Variations can limit the cost, help protect against collusion and
aid coordination. We discuss applications to campaign finance, open source
software ecosystems, news media finance and urban public projects. More
broadly, we offer a resolution to the classic liberal-communitarian debate in
political philosophy by providing neutral and non-authoritarian rules that
nonetheless support collective organization.
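The funding rule stated above, that a project receives (proportional to) the square of the sum of the square roots of individual contributions, takes one line to compute. The sketch below illustrates its defining property: many small contributors are rewarded far more than a single large contributor of the same total.

```python
import numpy as np

def funding(contributions):
    """Quadratic funding rule: (sum of square roots of contributions)**2."""
    return np.sqrt(np.asarray(contributions, dtype=float)).sum() ** 2

# Same total contribution (100), very different funding outcomes:
print(funding([1.0] * 100))   # 100 donors of 1  -> 10000.0
print(funding([100.0]))       # 1 donor of 100   -> 100.0
```

Under this rule the marginal funding from an additional small donor grows with the number of existing donors, which is the mechanism behind the "(near) optimal provision" claim in the standard model.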
]]>
834098
http://www.moneyscience.com/pg/blog/arXiv/read/834098/liberal-radicalism-formal-rules-for-a-society-neutral-among-communities-arxiv180906421v1-econgn
http://www.moneyscience.com/pg/blog/arXiv/read/834097/a-language-for-largescale-collaboration-in-economics-a-streamlined-computational-representation-of-financial-models-arxiv180906471v1-qfincp
Tue, 18 Sep 2018 19:36:52 -0500
<![CDATA[A Language for Large-Scale Collaboration in Economics: A Streamlined Computational Representation of Financial Models. (arXiv:1809.06471v1 [q-fin.CP])]]>
This paper introduces Sigma, a domain-specific computational representation
for large-scale collaboration in the field of economics. A computational
representation is not a programming language or a software platform. A
computational representation is a domain-specific representation system based
on three specific elements: facets, contributions, and constraints of data.
Facets are definable aspects that make up a subject or an object. Contributions
are shareable and formal evidence, carrying specific properties, and produced
as a result of a crowd-based scientific investigation. Constraints of data are
restrictions defining domain-specific rules of association between entities and
relationships. A computational representation serves as a layer of abstraction
required to define domain-specific concepts in computers, in such a way that
these concepts can be shared for the purposes of controlled, large-scale
scientific investigation by crowds. Facets, contributions, and
constraints of data are defined for any domain of knowledge by the application
of a generic set of inputs, procedural steps, and products called a
representational process. The application of this generic process to our domain
of knowledge, the field of economics, produces Sigma. Sigma is described in
this paper in terms of its three elements: facets (streaming, reactives,
distribution, and simulation), contributions (financial models, processors, and
endpoints), and constraints of data (configuration, execution, and simulation
meta-model). Each element of the generic representational process and the Sigma
computational representation is described and formalized in detail.
]]>
834097
http://www.moneyscience.com/pg/blog/arXiv/read/834097/a-language-for-largescale-collaboration-in-economics-a-streamlined-computational-representation-of-financial-models-arxiv180906471v1-qfincp
http://www.moneyscience.com/pg/blog/arXiv/read/834096/the-distortion-principle-for-insurance-pricing-properties-identification-and-robustness-arxiv180906592v1-qfinrm
Tue, 18 Sep 2018 19:36:51 -0500
<![CDATA[The distortion principle for insurance pricing: properties, identification and robustness. (arXiv:1809.06592v1 [q-fin.RM])]]>
Distortion (Denneberg 1990) is a well-known premium calculation principle for
insurance contracts. In this paper, we study sensitivity properties of
distortion functionals w.r.t. the assumptions for risk aversion, as well as
robustness w.r.t. ambiguity of the loss distribution. Ambiguity is measured by
the Wasserstein distance. We study variances of distances for probability
models and identify some worst-case distributions. In addition to the direct
problem we also investigate the inverse problem, that is how to identify the
distortion density on the basis of observations of insurance premia.
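A distortion premium can be computed directly from its definition, premium = integral over x >= 0 of g(P(X > x)) dx, where g is a concave distortion function. The sketch below uses the power distortion g(u) = u**0.7 and an exponential loss with mean 1; both are illustrative choices, not taken from the paper.

```python
import numpy as np

# Power distortion function (concave on [0, 1]); 0.7 is illustrative.
g = lambda u: u ** 0.7

# Survival function of an Exp(1) loss on a fine grid, then a trapezoid
# rule for the premium integral  ∫ g(P(X > x)) dx.
xs = np.linspace(0.0, 50.0, 200001)
survival = np.exp(-xs)
vals = g(survival)
premium = float(np.sum((vals[:-1] + vals[1:]) / 2 * np.diff(xs)))

print(round(premium, 4))   # exact value here: 1 / 0.7 ≈ 1.4286
```

Since g(u) >= u for a concave distortion with g(0)=0, g(1)=1, the premium exceeds the expected loss (here 1), encoding risk aversion through the loading.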
]]>
834096
http://www.moneyscience.com/pg/blog/arXiv/read/834096/the-distortion-principle-for-insurance-pricing-properties-identification-and-robustness-arxiv180906592v1-qfinrm
http://www.moneyscience.com/pg/blog/arXiv/read/834095/a-consistent-stochastic-model-of-the-term-structure-of-interest-rates-for-multiple-tenors-arxiv180906643v1-qfinpr
Tue, 18 Sep 2018 19:36:51 -0500
<![CDATA[A Consistent Stochastic Model of the Term Structure of Interest Rates for Multiple Tenors. (arXiv:1809.06643v1 [q-fin.PR])]]>
Explicitly taking into account the risk incurred when borrowing at a shorter
tenor versus lending at a longer tenor ("roll-over risk"), we construct a
stochastic model framework for the term structure of interest rates in which a
frequency basis (i.e. a spread applied to one leg of a swap to exchange one
floating interest rate for another of a different tenor in the same currency)
arises endogenously. This roll-over risk consists of two components: a credit
risk component due to the possibility of being downgraded and thus facing a
higher credit spread when attempting to roll over short-term borrowing, and a
component reflecting the (systemic) possibility of being unable to roll over
short-term borrowing at the reference rate (e.g., LIBOR) due to an absence of
liquidity in the market. The modelling framework is of "reduced form" in the
sense that (similar to the credit risk literature) the source of credit risk is
not modelled (nor is the source of liquidity risk). However, the framework has
more structure than the literature seeking to simply model a different term
structure of interest rates for each tenor frequency, since relationships
between rates for all tenor frequencies are established based on the modelled
roll-over risk. We proceed to consider a specific case within this framework,
where the dynamics of interest rates and roll-over risk are driven by a
multifactor Cox/Ingersoll/Ross-type process, show how such a model can be
calibrated to market data, and how it can be used for the relative pricing of
interest rate derivatives, including bespoke tenor frequencies not liquidly
traded in the market.
]]>
834095
http://www.moneyscience.com/pg/blog/arXiv/read/834095/a-consistent-stochastic-model-of-the-term-structure-of-interest-rates-for-multiple-tenors-arxiv180906643v1-qfinpr
http://www.moneyscience.com/pg/blog/arXiv/read/834094/dynamical-variety-of-shapes-in-financial-multifractality-arxiv180906728v1-qfinst
Tue, 18 Sep 2018 19:36:51 -0500
<![CDATA[Dynamical variety of shapes in financial multifractality. (arXiv:1809.06728v1 [q-fin.ST])]]>
The concept of multifractality offers a powerful formal tool to filter out a
multitude of the most relevant characteristics of complex time series. The
related studies thus far presented in the scientific literature typically limit
themselves to evaluating whether or not a time series is multifractal, with the
width of the resulting singularity spectrum taken as a measure of the degree of
complexity involved. However, the character of the complexity of time series
generated by natural processes usually appears much more intricate than such a
bare statement can reflect. As an example, based on the long-term records of
the S&P 500 and NASDAQ, the two leading world stock market indices, the
present study shows that they indeed develop multifractal features, but these
features evolve through a variety of shapes, most often strongly asymmetric,
whose changes are typically correlated with the historically most significant
events experienced by the world economy. At the same time, relating the
multifractal singularity spectra of an index to those of the component stocks
that form it reflects the varying degree of correlation among the stocks.
]]>
834094
http://www.moneyscience.com/pg/blog/arXiv/read/834094/dynamical-variety-of-shapes-in-financial-multifractality-arxiv180906728v1-qfinst
http://www.moneyscience.com/pg/blog/arXiv/read/834093/on-expansions-for-the-blackscholes-prices-and-hedge-parameters-arxiv180906736v1-qfinpr
Tue, 18 Sep 2018 19:36:51 -0500
<![CDATA[On expansions for the Black-Scholes prices and hedge parameters. (arXiv:1809.06736v1 [q-fin.PR])]]>
We derive new formulas for the prices of European call and put options in
the Black-Scholes model, in the form of uniformly convergent series
generalizing previously known approximations. We also provide precise bounds
on the convergence speed and apply the results to the calculation
of hedge parameters (Greeks).
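For reference, the closed-form Black-Scholes call price and delta that such series expansions approximate can be computed with the standard library alone (the textbook formula, not the paper's new series):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function (no SciPy needed)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes European call price and delta (a hedge parameter)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    price = S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)
    delta = norm_cdf(d1)   # dPrice/dS
    return price, delta

price, delta = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
print(round(price, 4), round(delta, 4))   # 10.4506 0.6368
```

A series expansion of the type discussed above would be benchmarked against these exact values, with the convergence-speed bounds controlling the truncation error.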
]]>
834093
http://www.moneyscience.com/pg/blog/arXiv/read/834093/on-expansions-for-the-blackscholes-prices-and-hedge-parameters-arxiv180906736v1-qfinpr
http://www.moneyscience.com/pg/blog/arXiv/read/834092/a-generalized-framework-for-simultaneous-longshort-feedback-trading-arxiv180605561v2-qfintr-updated
Tue, 18 Sep 2018 19:36:51 -0500
<![CDATA[A Generalized Framework for Simultaneous Long-Short Feedback Trading. (arXiv:1806.05561v2 [q-fin.TR] UPDATED)]]>
We present a generalization of the Simultaneous Long-Short (SLS) trading
strategy described in recent control literature wherein we allow for different
parameters across the short and long sides of the controller; we refer to this
new strategy as Generalized SLS (GSLS). Furthermore, we investigate the
conditions under which positive gain can be assured within the GSLS setup for
both deterministic stock price evolution and geometric Brownian motion. In
contrast to existing literature in this area (which places little emphasis on
the practical application of SLS strategies), we suggest optimization
procedures for selecting the control parameters based on historical data, and
we extensively test these procedures across a large number of real stock price
trajectories (495 in total). We find that the implementation of such
optimization procedures greatly improves performance compared with fixed
control parameters, and, indeed, the GSLS strategy generally outperforms the
simpler SLS strategy.
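A minimal sketch of the GSLS feedback law described above: each side updates its investment linearly in its own accumulated gain, with separate initial-investment and feedback parameters on the long and short sides. The parameter values and the toy trending price path are illustrative assumptions; this is not the authors' data-driven parameter-selection procedure.

```python
import numpy as np

def gsls_gain(returns, I0_long, K_long, I0_short, K_short):
    """Total gain of a GSLS controller over a sequence of per-period returns."""
    gL = gS = 0.0
    for rho in returns:
        invL = I0_long + K_long * gL          # long-side investment
        invS = -(I0_short + K_short * gS)     # short-side investment
        gL += rho * invL                      # long-side gain update
        gS += rho * invS                      # short-side gain update
    return gL + gS

# A steadily trending toy price path; SLS-type controllers are designed
# to profit from sustained trends in either direction.
prices = np.linspace(100.0, 120.0, 60)
returns = np.diff(prices) / prices[:-1]
print(round(gsls_gain(returns, 1.0, 2.0, 1.0, 2.0), 4))
```

On a monotone up-trend the long side compounds gains while the short side's loss stays bounded (and symmetrically on a down-trend), which is the "positive gain" property the abstract analyzes.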
]]>
834092
http://www.moneyscience.com/pg/blog/arXiv/read/834092/a-generalized-framework-for-simultaneous-longshort-feedback-trading-arxiv180605561v2-qfintr-updated
http://www.moneyscience.com/pg/blog/arXiv/read/834028/kernelbased-collocation-methods-for-heathjarrowmorton-models-with-musiela-parametrization-arxiv180905643v1-qfincp
Mon, 17 Sep 2018 19:56:53 -0500
<![CDATA[Kernel-based collocation methods for Heath-Jarrow-Morton models with Musiela parametrization. (arXiv:1809.05643v1 [q-fin.CP])]]>
We propose kernel-based collocation methods for numerical solutions to
Heath-Jarrow-Morton models with Musiela parametrization. The methods can be
seen as the Euler-Maruyama approximation of some finite dimensional stochastic
differential equations, and allow us to compute the derivative prices by the
usual Monte Carlo methods. We derive a bound on the rate of convergence under
some decay condition on the inverse of the interpolation matrix and some
regularity conditions on the volatility functionals.
]]>
834028
http://www.moneyscience.com/pg/blog/arXiv/read/834028/kernelbased-collocation-methods-for-heathjarrowmorton-models-with-musiela-parametrization-arxiv180905643v1-qfincp
http://www.moneyscience.com/pg/blog/arXiv/read/834027/trends-in-the-diffusion-of-misinformation-on-social-media-arxiv180905901v1-cssi
Mon, 17 Sep 2018 19:56:52 -0500
<![CDATA[Trends in the Diffusion of Misinformation on Social Media. (arXiv:1809.05901v1 [cs.SI])]]>
We measure trends in the diffusion of misinformation on Facebook and Twitter
between January 2015 and July 2018. We focus on stories from 570 sites that
have been identified as producers of false stories. Interactions with these
sites on both Facebook and Twitter rose steadily through the end of 2016.
Interactions then fell sharply on Facebook while they continued to rise on
Twitter, with the ratio of Facebook engagements to Twitter shares falling by
approximately 60 percent. We see no similar pattern for other news, business,
or culture sites, where interactions have been relatively stable over time and
have followed similar trends on the two platforms both before and after the
election.
]]>
834027
http://www.moneyscience.com/pg/blog/arXiv/read/834027/trends-in-the-diffusion-of-misinformation-on-social-media-arxiv180905901v1-cssi