Hybrid Classical-Quantum Computing: Applications to Statistical Mechanics of Financial Markets

Hybrid classical-quantum computing has already arrived in several commercial quantum computers offered to researchers and businesses. Here, applications are made to a model of financial options, Statistical Mechanics of Financial Markets (SMFM), published in many papers since the 1980s. This project uses only classical (super-)computers to include quantum features of these models. Since 1989, an optimization code, Adaptive Simulated Annealing (ASA), has been used to fit parameters in such models. Since 2015, a path-integral algorithm, PATHINT, used previously to accurately describe several systems in several disciplines, has been generalized from 1 dimension to N dimensions, and from classical to quantum systems, as qPATHINT. Published papers have described the application of qPATHINT to neocortical interactions and financial options. The classical space described by SMFM applies nonlinear nonequilibrium multivariate statistical mechanics to fit parameters in conditional short-time probability distributions, while the quantum space described by qPATHINT deals specifically with quantum systems, e.g., quantum money. This project thereby demonstrates how some hybrid classical-quantum systems may be calculated quite well using only classical (super-)computers.


SMNI
Previous papers since 1981 have calculated properties of a model of neocortex, Statistical Mechanics of Neocortical Interactions (SMNI), to fit/describe many experimental data, e.g., electroencephalographic (EEG) data, using a model of quantum wave-packets composed of a specific class of Ca2+ ions that are (re-)generated at tripartite neuron-astrocyte-neuron sites, thereby influencing synaptic interactions (Ingber, 2018). Since 2011 (Ingber, 2011, 2012a), both classical and quantum calculations have been shown to be consistent with an interaction between the momenta p of these wave-packets and a magnetic vector potential A that is generated by highly synchronous firings of many thousands of neocortical neurons during tasks requiring short-term memory (STM). A closed-form ("analytic") path-integral calculation of this quantum process developed a wave-function and an expectation value of p in the presence of A, thereby defining an example of quantum interactions coupled to a macroscopic system. An important reason for further addressing this particular system with these codes is to enable more realistic inclusion of shocks to the wave-packet wave-function due to the regenerative process of the wave-packet, e.g., collisions between Ca2+ ions in the wave-packet causing some ions to leave the wave-packet during its hundreds-of-msec lifetime, or new ions entering from the astrocyte processes. These may be considered as random processes (Ross, 2012). The codes PATHTREE/qPATHTREE and PATHINT/qPATHINT have been designed to include shocks in the evolution of a short-time probability distribution over thousands of foldings (Ingber, 2016, 2017a). More details on how hybrid quantum-classical computing is currently being applied to this system are given in a companion paper (Ingber, 2021b).

Organization of paper
Section 2 further describes Adaptive Simulated Annealing (ASA) in the context of this project. Section 3 further describes PATHINT/qPATHINT in the context of this project. Section 4 further describes SMFM in the context of this project. Section 5 describes performance and scaling issues. Section 6 is the Conclusion.

ASA Algorithm
For sampling parameters with random variables x_i, the default generating function is defined in terms of "temperatures" for the parameters (Ingber, 1989). The default ASA uses the same type of annealing schedule for the acceptance function h as for the generating function g. All default functions in ASA can be overridden with user-defined functions (Ingber, 1993, 2012b). Recently, ASA has been applied to studies of COVID-19, fitting forms like x S^y, for variables S and parameters x and y, in the drifts and covariances of conditional probability distributions (Ingber, 2021c).
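The default generating function and annealing schedule can be sketched as follows. This is a minimal illustration of the published ASA forms (temperature schedule T_i(k) = T_{0i} exp(-c_i k^{1/D}) and the bounded generating variable y in [-1, 1]), not the production C code; parameter names are illustrative.

```python
import math
import random

def asa_temperature(T0, c, k, D):
    """Default ASA annealing schedule: T_i(k) = T0_i * exp(-c_i * k**(1/D))."""
    return T0 * math.exp(-c * k ** (1.0 / D))

def asa_generate(x, lo, hi, T, rng):
    """Draw a trial point from ASA's default generating distribution:
    y = sgn(u - 1/2) * T * ((1 + 1/T)**|2u - 1| - 1), with y in [-1, 1];
    the candidate x + y*(hi - lo) is resampled until it lies in [lo, hi]."""
    while True:
        u = rng.random()
        y = math.copysign(1.0, u - 0.5) * T * ((1.0 + 1.0 / T) ** abs(2.0 * u - 1.0) - 1.0)
        x_new = x + y * (hi - lo)
        if lo <= x_new <= hi:
            return x_new

rng = random.Random(0)
T = asa_temperature(T0=1.0, c=1.0, k=10, D=1)   # D = number of parameters
x_new = asa_generate(x=0.5, lo=0.0, hi=1.0, T=T, rng=rng)
```

As the annealing index k grows, T shrinks and trial points cluster ever more tightly around the current point, which is what permits the exponentially fast schedule.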
ASA has over 150 OPTIONS that provide robust tuning across many classes of nonlinear stochastic systems.
The "QUENCHing" OPTIONS are among the most important for controlling Adaptive Simulated Annealing. Fuzzy ASA algorithms in particular offer ways of adapting how these QUENCHing OPTIONS are applied across many classes of problems.
In the context of this project, ASA has an ASA_SAVE_BACKUP OPTIONS which periodically saves all information (including generated random numbers) sufficient to restart a run if it is interrupted. The ASA_EXIT_ANYTIME OPTIONS permits simply removing a file "asa_exit_anytime", which causes ASA to exit gracefully. In this project, ASA removes "asa_exit_anytime" every 47 hours.
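The sentinel-file convention just described can be mimicked in a few lines; this toy loop is only a sketch of the mechanism (a real ASA run would also dump the ASA_SAVE_BACKUP restart state before exiting), and the controller that removes the file is simulated inline.

```python
import os
import tempfile

EXIT_FILE = "asa_exit_anytime"   # sentinel file name used by ASA_EXIT_ANYTIME

def run_with_sentinel(workdir, steps):
    """Toy annealing loop: keep working while the sentinel file exists,
    and exit gracefully once it has been removed."""
    done = 0
    for step in range(steps):
        if not os.path.exists(os.path.join(workdir, EXIT_FILE)):
            break                 # graceful exit; restart state already saved
        done += 1
        if step == 2:             # stand-in for an external controller/cron job
            os.remove(os.path.join(workdir, EXIT_FILE))
    return done

with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, EXIT_FILE), "w").close()   # create sentinel: keep running
    completed = run_with_sentinel(d, steps=10)
```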

Generic Applications
There are many systems that are well defined by (a) Fokker-Planck/Chapman-Kolmogorov partial-differential equations, (b) coupled Langevin stochastic-differential equations, and (c) Lagrangian or Hamiltonian path integrals. All three such systems of equations are mathematically equivalent, when care is taken to properly take limits of discretized variables in the well-defined induced Riemannian geometry of the system due to nonlinear and time-dependent diffusions (Ingber, 1982, 1983; Langouche et al., 1982; Schulman, 1981).
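The equivalence can be checked numerically in the simplest setting: simulating the Langevin equation (here with constant drift and diffusion, chosen only for illustration) reproduces the first and second moments predicted by the corresponding Fokker-Planck equation, mean = drift*t and variance = diffusion*t.

```python
import math
import random
import statistics

def euler_maruyama(drift, diffusion, x0, t, dt, rng):
    """Integrate the Langevin equation dX = drift*dt + sqrt(diffusion)*dW
    (Ito, prepoint discretization) from time 0 to t."""
    x = x0
    for _ in range(int(round(t / dt))):
        x += drift * dt + math.sqrt(diffusion * dt) * rng.gauss(0.0, 1.0)
    return x

rng = random.Random(42)
drift, diffusion, t = 0.5, 0.2, 1.0
samples = [euler_maruyama(drift, diffusion, 0.0, t, 0.01, rng) for _ in range(20000)]
mean = statistics.fmean(samples)     # Fokker-Planck prediction: drift * t = 0.5
var = statistics.pvariance(samples)  # Fokker-Planck prediction: diffusion * t = 0.2
```

For nonlinear, time-dependent diffusions the same correspondence holds, but only with the careful discretization (Ito prepoint vs. Stratonovich midpoint) discussed in the references.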

Path-Integral Algorithm
The path integral of a classical system of N variables q^i, indexed by i, at multiple times indexed by rho, is defined in terms of its Lagrangian L,

  L = sum_i (dq^i/dt - g^i)^2 / (2 g^{ii}) ,

where the diagonal diffusion terms are g^{ii} and the drift terms are g^i. If the diffusion terms are not constant, then there are additional terms in the drift, and a Riemannian-curvature potential R/6 for dimension > 1 in the midpoint Stratonovich/Feynman discretization (Langouche et al., 1982). The path-integral approach is particularly useful to precisely define intuitive physical variables from the Lagrangian L in terms of its underlying variables q^i, e.g., the Canonical Momenta Indicators (CMI),

  Pi^i = dL/d(dq^i/dt) ,

which were used successfully in neuroscience (Ingber, 1997, 1998), combat analyses (Bowman & Ingber, 1997), and financial markets. The histogram procedure recognizes that the distribution can be numerically approximated to a high degree of accuracy by sums of rectangles of height P_i and width Delta q^i at points q^i. For convenience only, consider a one-dimensional system. In the prepoint Ito discretization, the path-integral representation can be written in terms of the kernel G, for each of its intermediate integrals, as

  P(q; t + Delta t) = Integral dq' G(q, q'; Delta t) P(q'; t) .

This yields

  P_i(t + Delta t) = sum_j T_ij(Delta t) P_j(t) ,

where T_ij is a banded matrix representing the Gaussian nature of the short-time probability centered about the (possibly time-dependent) drift.
Explicit dependence of L on time t also has been included without complications. Care must be taken in developing the mesh Delta q^i, which is strongly dependent on the diagonal elements of the diffusion matrix, e.g.,

  Delta q^i ~ (g^{ii} Delta t)^{1/2} .

This constrains the dependence of the covariance of each variable to be a (nonlinear) function of that variable in order to present an approximately rectangular underlying mesh. Since integration is inherently a smoothing process (Ingber, 1990), fitting the data with integrals over the short-time probability distribution permits the use of coarser meshes than the corresponding stochastic differential equation(s). For example, the coarser resolution is appropriate, and typically required, for a numerical solution of the time-dependent path integral. By considering the contributions to the first and second moments, conditions on the time and variable meshes can be derived (Wehner & Wolfer, 1983a). For non-zero drift, the time slice may be determined by a scan of Delta t <= Lbar^{-1}, where Lbar is the uniform/static Lagrangian, respecting the ranges giving the most important contributions to the probability distribution P.
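A minimal 1-D sketch of the histogram/kernel-folding procedure, assuming a uniform mesh and constant drift and diffusion (the production PATHINT code uses variable meshes and stores only the band of T_ij):

```python
import math

def pathint_fold(p, q, drift, diffusion, dt, foldings):
    """Fold the histogrammed distribution, P_i(t+dt) = sum_j T_ij(dt) P_j(t),
    where T_ij is the Gaussian short-time kernel in the prepoint Ito
    discretization, centered about the drifted prepoint q_j + drift*dt."""
    n = len(q)
    dq = q[1] - q[0]                 # uniform mesh, for simplicity
    two_var = 2.0 * diffusion * dt   # 2 * g^{11} * dt
    T = [[dq * math.exp(-(q[i] - q[j] - drift * dt) ** 2 / two_var)
          / math.sqrt(math.pi * two_var) for j in range(n)] for i in range(n)]
    for _ in range(foldings):
        p = [sum(T[i][j] * p[j] for j in range(n)) for i in range(n)]
    return p

n, dq = 201, 0.05
q = [(i - n // 2) * dq for i in range(n)]
p0 = [1.0 / dq if i == n // 2 else 0.0 for i in range(n)]   # delta function at q = 0
p = pathint_fold(p0, q, drift=0.3, diffusion=0.1, dt=0.05, foldings=10)
norm = sum(pi * dq for pi in p)                      # should remain ~1
mean = sum(qi * pi * dq for qi, pi in zip(q, p))     # should approach drift * t = 0.15
```

Note that dq = 0.05 respects the mesh condition above: (g^{11} Delta t)^{1/2} = (0.1 * 0.05)^{1/2} ≈ 0.07 > dq.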

Direct Kernel Evaluation
Several projects have used this algorithm (Ingber & Nunez, 1995; Ingber & Wilson, 1999; Wehner & Wolfer, 1983a,b, 1987). Special 2-dimensional codes were developed for specific projects in Statistical Mechanics of Combat (SMC) (Ingber et al., 1991), SMNI (Ingber & Nunez, 1995), and Statistical Mechanics of Financial Markets (SMFM) (Ingber, 2000). The previous 1-D PATHINT code was generalized by the author to run in N dimensions, using 'make D=N' in the GCC Makefile. Then a quantum generalization was made to the code, changing all variables and functions to complex variables, encompassing about 7500 lines of PATHINT code. The generic N-dimensional code was developed for classical and quantum systems, using a small shell script called from a Makefile to set up pre-compile options (Ingber, 2016, 2017a).

Monte Carlo vs Kernels
Many path-integral numerical applications use Monte Carlo techniques (O'Callaghan & Miller, 2014), e.g., the author's ASA code using its ASA_SAMPLE OPTIONS (Ingber, 1993). However, this project is concerned with serial (time-sequential) random shocks, which are not conveniently treated with Monte-Carlo/importance-sampling algorithms.

Scaling Issues
The use of qPATHINT has been tested with shocks to quantum options wave-functions (Ingber, 2017b), serving to illustrate some computational scaling issues, further described in the Performance and Scaling Section.

Imaginary Time
Imaginary-time Wick rotations transform imaginary time (the primary source of imaginary dependencies) into a real-variable time. However, when used with numerical calculations, after multiple foldings of the path integral, usually there is no audit trail back to imaginary time to extract phase information (private communication with several authors of path-integral papers, including Larry Schulman on 18 Nov 2015) (Schulman, 1981).

SMFM With qPATHINT
The above considerations define a clear process for fitting a volatility-of-volatility options model with classical computation, with qPATHINT numerically calculating the path integral at each time between epochs defined by the quantum mesh. At the beginning of each epoch, time is reset (t = 0), since the wave-function is considered to have been decohered ("collapsed") by a prior American stop-measurement; until the end of that epoch there are multiple calls to quantum functions to calculate the evolution of the conditional short-time probability distribution, and each call also calls qPATHINT for numerical calculation of the path integral.
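The epoch structure can be sketched for a free quantum wave-packet. Here a unitary Crank-Nicolson step stands in for qPATHINT's kernel folding (an assumption made only to keep the example short), and each "stop-measurement" collapses psi to a packet at its current expectation value, resetting the epoch clock.

```python
import cmath
import math

def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a complex tridiagonal system."""
    n = len(rhs)
    cp, dp = [0j] * n, [0j] * n
    cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0j] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def evolve(psi, dx, dt, steps, hbar=1.0, mass=1.0):
    """Unitary Crank-Nicolson (Cayley) steps of the free Schrodinger equation."""
    n = len(psi)
    s = 1j * hbar * dt / (4.0 * mass * dx * dx)
    sub, sup, diag = [-s] * n, [-s] * n, [1.0 + 2.0 * s] * n
    for _ in range(steps):
        rhs = [psi[i] + s * ((psi[i + 1] if i < n - 1 else 0j)
                             - 2.0 * psi[i]
                             + (psi[i - 1] if i > 0 else 0j)) for i in range(n)]
        psi = solve_tridiag(sub, diag, sup, rhs)
    return psi

def packet(q, center, width, dx):
    """Normalized Gaussian wave-packet on the grid."""
    psi = [cmath.exp(-(qi - center) ** 2 / (4.0 * width ** 2)) for qi in q]
    norm = math.sqrt(sum(abs(p) ** 2 for p in psi) * dx)
    return [p / norm for p in psi]

def collapse(q, psi, dx, width):
    """Stop-measurement: replace psi by a packet at the current <q>."""
    qbar = sum(qi * abs(p) ** 2 for qi, p in zip(q, psi)) * dx
    return packet(q, qbar, width, dx)

n, dx = 400, 0.05
q = [(i - n // 2) * dx for i in range(n)]
psi = packet(q, 0.0, 0.5, dx)
for _epoch in range(3):                       # epochs between stop-measurements
    psi = evolve(psi, dx, dt=0.001, steps=50)
    psi = collapse(q, psi, dx, width=0.5)     # decoherence; epoch clock resets
norm = sum(abs(p) ** 2 for p in psi) * dx
```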

SMFM 2-D
Our two-factor model includes stochastic volatility sigma of the underlying S, where S_0 and S_inf are selected to lie outside the data region used to fit the other parameters, e.g., S_0 = 1 and S_inf = 20 for fits to Eurodollar futures, which historically have a very tight range relative to other markets. We have used the Black-Scholes form F = S inside S < S_0 to obtain the usual benefits, e.g., no negative prices as the distribution is naturally excluded from S < 0, and preservation of put-call parity. Put-call parity for European options is derived quite independently of any mathematical model of options (Hull, 1997). In its simplest form, it is given by

  c + X exp(-r(T - t)) = p + S ,

where c (p) is the fair price of a call (put), X is the strike price, r is the risk-free interest rate, t is the present time, T is the time of expiration, and S is the underlying market. We have taken y = 0, a normal distribution, to reflect total ignorance of markets outside the fitted range, S > S_inf.
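Put-call parity can be verified directly against the standard Black-Scholes formulas; the volatility and time to expiration below are illustrative values, not fitted parameters.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def black_scholes(S, X, r, sigma, tau):
    """European call and put prices under Black-Scholes (no dividends)."""
    d1 = (math.log(S / X) + (r + 0.5 * sigma * sigma) * tau) / (sigma * math.sqrt(tau))
    d2 = d1 - sigma * math.sqrt(tau)
    call = S * norm_cdf(d1) - X * math.exp(-r * tau) * norm_cdf(d2)
    put = X * math.exp(-r * tau) * norm_cdf(-d2) - S * norm_cdf(-d1)
    return call, put

S, X, r = 7.0, 7.5, 0.0675        # underlying, strike, risk-free rate
sigma, tau = 0.2, 0.5             # illustrative volatility and time to expiry
c, p = black_scholes(S, X, r, sigma, tau)
lhs = c + X * math.exp(-r * tau)  # c + X e^{-r(T-t)}
rhs = p + S                       # p + S
```

The parity relation holds identically, independent of sigma and tau, as the text notes it must for any model-independent derivation.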
The one-factor model just assumes a constant sigma. It is often noted that Black-Scholes (BS) models incorrectly include contributions from large-S regions because of their fat tails (Fabozzi, 1998). (If we wished to handle negative interest rates, i.e., ED prices > 100, we would shift the S = 0 axis to some S < 0 value.) We found that the abrupt, albeit continuous, changes across S_0, especially for x <= 0, did not cause any similar effects in the distributions evolved using these diffusions, as reported below.
The formula for pricing an option P, derived in a Black-Scholes generalized framework after factoring out interest-rate discounting, is equivalent to using the form of F above. We experimented with some alternative functional forms, primarily to apply smooth cutoffs across the above three regions of S, e.g., a function F designed to revert to the lognormal Black-Scholes model in several limits. However, our fits were most sensitive to the data when we permitted the central region to be pure S^x, using F above.
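One plausible reconstruction of the three-region diffusion scale F (F = S below S_0, pure S^x in the central fitted region, S^y above S_inf, with prefactors chosen for continuity) can be sketched as follows; the prefactors in the published model may differ, so treat this as an illustration only.

```python
def F(S, S0, Sinf, x, y):
    """Piecewise diffusion scale: Black-Scholes F = S below S0, pure S**x in
    the central fitted region, S**y above Sinf. The prefactors S0**(1 - x)
    and Sinf**(x - y) are a reconstruction chosen to make F continuous."""
    if S < S0:
        return S
    if S <= Sinf:
        return S ** x * S0 ** (1.0 - x)
    return S ** y * Sinf ** (x - y) * S0 ** (1.0 - x)

S0, Sinf, x, y = 1.0, 20.0, 0.7, 0.0   # Eurodollar-style settings from the text
```

With y = 0, F is constant above S_inf, matching the stated "total ignorance" of markets beyond the fitted range.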

Two-Factor Volatility and PATHINT Modifications
In our two-factor model, the mesh of S would depend on sigma, causing problems in any PATHINT grid developed in S-sigma. For some time we have considered how to handle this generic problem for n-factor multivariate systems with truly multivariate diffusions within the framework of PATHINT. In one case, we took advantage of the Riemannian invariance of the probability distribution, as discussed above, to transform to a system where the diffusions have only "diagonal" multiplicative dependence (Ingber, 1994). However, this leads to cumbersome numerical problems with the transformed boundary conditions (Ingber & Nunez, 1995). Another method, not yet fully tested, is to develop a tiling of diagonal meshes for each factor i that often are suitable for off-diagonal regions in an n-factor system, e.g.,

  Delta M^i_k = 2^{m^i_k} Delta M^i_0 ,

where the mesh of variable i at a given point labeled by k is a power of 2, labeled by m^i_k, times some minimal mesh Delta M^i_0; the integral power m^i_k is determined such that it gives a good approximation to the diagonal mesh given by the one-factor PATHINT mesh conditions, throughout the regions of the Lagrangian giving the most important contributions to the distribution, as predetermined by a scan of the system. This tiling of the kernel is to be used together with interpolation of intermediate distributions.
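The power-of-2 tiling can be sketched as choosing, at each point, the integral power m that best matches the one-factor mesh condition Delta q ~ (g^{ii} Delta t)^{1/2}; the function below is an illustration of that selection rule, not the production tiling code.

```python
import math

def tile_mesh(diffusion, dt, dM0):
    """Pick the integral power m (>= 0) such that dM0 * 2**m best matches
    the one-factor PATHINT mesh condition dq ~ sqrt(g_ii * dt), given the
    local diagonal diffusion and the minimal mesh dM0."""
    target = math.sqrt(diffusion * dt)
    m = max(0, round(math.log2(target / dM0)))
    return m, dM0 * 2 ** m

m, mesh = tile_mesh(diffusion=0.04, dt=0.01, dM0=0.005)   # target mesh = 0.02
```

Because every tile is a power-of-2 multiple of the same minimal mesh, neighboring tiles align, which is what makes interpolation of intermediate distributions straightforward.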
The results of our study here are that, after the at-the-money basis-point volatilities (BPV) are scaled to be equivalent, there is not a very drastic change in the one-factor ATM Greeks. Therefore, while we have not at all changed the functional dependence of the Lagrangian on S and sigma, we have determined our meshes using a diffusion for the S equation of sigma_0 F(S, S_0, S_inf, x, y), where sigma_0 is determined by the same BPV-equivalent condition as imposed on the one-factor models. This seems to work very well, especially since we have taken our sigma equation to be normal with a limited range of influence in the calculations. Future work has yet to establish a more definitive distribution.

Previous XSEDE SMFM Project
It seems that "if" quantum money (QM) arrives is not much of an issue. When QM does arrive, it is clear that options on quantum markets will be required for purposes of hedging and speculation.

Options Calculations
Nine off-diagonal terms are used on each side of the diagonal of the kernel. The model uses a noise of S^x, where S is the underlying price and x is an exponent. The underlying price is taken to be 7.0. A strike value of 7.5 is used for this table. The risk-free rate is taken to be 0.0675. The cost of carry is taken to be 0. A daily volatility of 0.00793725 is used, and this parameter is taken to be real for both PATHINT and qPATHINT.
There is no additional drift added, but a drift arises from the nonlinear noise used (Ingber, 2000; Ingber & Wilson, 1999). In this context, note that shocks can affect Greeks with "p" quite severely, where "p" denotes an additional order of derivatives, e.g., VegapPI (the second derivative of the option price Upsilon with respect to volatility) is very sensitive to shocks in this particular drift, as described above for serial shocks.
Results are given in a previous paper (Ingber, 2017b).

Current Project
The current project uses ASA to fit volatility of volatility options using short-time conditional probability distributions, similar to what was previously performed (Ingber, 2000), but now these forms are based on quantum, instead of classical, money, as described above.

Performance and Scaling
Code from a previous XSEDE grant, "Quantum path-integral qPATHTREE and qPATHINT algorithm", is used for the qPATHINT runs.

SMNI Scaling Estimates
As an example, SMNI estimates were made on XSEDE.org's Expanse using 'gcc -O3', for just the one-dimensional system. In this context, no debugging flags (e.g., '-g') are used unless specifically noted otherwise.
The current code uses a variable mesh covering 1121 points along the diagonal, with a maximum off-diagonal spread of 27; corners require considerable CPU time to take care of boundaries, etc. Oscillatory wave functions require a large off-diagonal spread (Ingber, 2017a).
If dt = 0.0002, this requires 10 foldings of the distribution. This takes the code 0.002 s to run, i.e., 0.0002 s per qPATHINT iteration. Note that with '-g' the code takes 0.004 s to run.
Note that the maximum duration of a normal XSEDE.org job is 2 days. ASA has a built-in simple way of ending jobs with the printout required to restart, including the sets of random numbers generated, so this is quite feasible.

SMNI XSEDE Ticket with Mahidhar Tatineni
In tickets.xsede.org #148054, dealing with SMNI runs (using 1-D qPATHINT), Mahidhar Tatineni replied: "Thanks, Lester. That makes it clear. I think in this case using the "shared" partition with array jobs will be perfect and you can restart every 48 hours (make sure you say this in the proposal so that reviewers are aware you can restart).
So you need a total of 24 x 140 = 3360 SUs for each set which is completely reasonable. If you do 100 such sets you will need 350K SUs which is completely fine from a request point of view (as long as the runs are justified for the science being done and there is a clear computational plan associated with it). Mahidhar"

Scaling Estimates N-D
Some N-dimensional qPATHINT runs for SMFM used a contrived N-factor model with the same 1-D system cloned in all dimensions (each unit is a "double complex"). The kernel size is (I J)^N, where I = imxall, J = jmxall (= kernel band width), and kernel size = ijkcnt. This spatial mesh might change at each time slice. Thus, the 2-D problem could take on the order of 200 times the 1-D problem, long but quite feasible. If the length of time becomes an issue, e.g., for dimensions > 2 and a high degree of nonlinearity, then fits of the drifts and covariance matrices to parameterized forms can be a very good option (Ingber, 2020).
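The kernel-size scaling quoted above is simply (I * J)^N in "double complex" units; a trivial helper makes the growth with dimension explicit (names follow the imxall/jmxall variables mentioned above).

```python
def kernel_units(I, J, N):
    """Kernel storage in 'double complex' units: (I * J)**N, with I = imxall
    (points along the diagonal) and J = jmxall (kernel band width)."""
    return (I * J) ** N

one_d = kernel_units(1121, 27, 1)   # the 1-D SMNI mesh quoted earlier
two_d = kernel_units(1121, 27, 2)   # the same system cloned into 2-D
```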

Conclusion
Published pilot studies give a rationale for further developing this particular quantum path-integral algorithm based on folding kernels, as it can be used to study the serial random shocks that occur in many real systems. Furthermore, this quantum version can be applied to many quantum systems, which are becoming increasingly important as experimental data accumulate at a rapid pace.