Research provisions

The research preparation section does what its name suggests: it provides the framework, or groundwork, within which the directives can deliberately be conducted. Accordingly, there are only two main categories: papers in preparation, and an uncategorized list of proposed papers which, should the need arise, will be categorized per field and per directive. The current bulletin of active projects and papers can be found here.

In preparations

Papers in preparation are not being actively conducted yet; rather, the logistics and infrastructure (intellectual or otherwise) are being built up.

Physics

  • Expository papers (quantum theory): entry 33 in the previous conception.
  • A paper on quantum-like systems and the emergence of the quantum method, or, under its old name, “Dynamic modelling, and the naturalness of quantum-similar system quantization”. In essence, we take the masking, proximal-style configuration present in quantum system propositions (that is, the mismatch of results, at least with respect to classical apparatus and observations) and test it on the ground of the new philosophical analysis of modelling done earlier. If the first quantization comes naturally as encoding, then perhaps the second quantization will follow as well. It might also explain, or apply to, what can be done with the chaotic, useless landscape that AI has: essentially abstractive localization, considering a higher diffusion area rather than fully numeric details. The strength is that we are doing quantum-like, not quantum. If this can be generalized into an effective treatment of such ‘games of discrepancies’, then perhaps it can reveal the paradox, and thus the naturalness, of quantum mechanics. RP01
  • A paper on low-dimensional semiconductors: quantum wells and their analysis. A deliberately expository review and summary paper. RP02

Uncategorized

Because of the sheer scale of research topics, it is currently not practical to categorize these approaches. Those that enter preparation, or that I find the time to organize, will be moved promptly to the sections above once they have sufficient content.

Double descent: a review

(review, reference): A reference and review paper on double descent, including an effort to formalize its definition.

Interpreting double descent in polynomial models via density estimation

(original, theory): A fairly light theoretical paper on a toy-model description of polynomial models and a different interpretation of double descent.
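
As a concrete starting point, the kind of toy experiment such a paper might be built around can be sketched in a few lines; everything here (the target function, sample sizes, and degree sweep) is an illustrative assumption, not a fixed design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy samples of a smooth target on [-1, 1].
n_train = 15
x_tr = rng.uniform(-1.0, 1.0, n_train)
y_tr = np.sin(2 * np.pi * x_tr) + 0.1 * rng.standard_normal(n_train)
x_te = np.linspace(-1.0, 1.0, 200)
y_te = np.sin(2 * np.pi * x_te)

def design(x, degree):
    # Monomial features 1, x, ..., x^degree.
    return np.vander(x, degree + 1, increasing=True)

test_errors = {}
for degree in [2, 5, 14, 30, 100]:  # n_train - 1 = 14 is the interpolation threshold
    A = design(x_tr, degree)
    # lstsq returns the minimum-norm solution in the overparameterized
    # regime, which is what can produce a second descent past the threshold.
    w, *_ = np.linalg.lstsq(A, y_tr, rcond=None)
    mse = float(np.mean((design(x_te, degree) @ w - y_te) ** 2))
    test_errors[degree] = mse
    print(f"degree {degree:3d}: test MSE {mse:.4f}")
```

Whether the second descent actually shows up depends on the feature map and the noise level, which is exactly the kind of sensitivity a density-estimation interpretation would have to account for.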

Double descent on support vector machine and polynomial models on binary classification – an analysis

(experimental, theory): A paper that is both theoretical and experimental.

PINNs and double descent

(experimental): An experimental paper on PINNs and how to identify double descent in them.

Classical learning theory and future

(original, review, experiments).

A review of model structures in machine learning and theoretical machine learning

(theoretical, review, reference): A reference paper on the concept of models in machine learning: their structures, flavours, considerations, and so on.

Different treatment of machine learning – an analysis

(review, reference, original): A paper on the different treatments of machine learning, through different lenses and at different scales, and how they fit together (categorical machine learning, for example).

Reuniting the neural network framework using partial category theory

(original, theory): Using the proposed categorical machine learning or categorical neural network framework to consider reuniting different architectures under the same formalism.

On the capacity and capability of neural networks

(review, original): A paper analysing the capacity and capability of neural networks across various tasks and settings, considering different structures.

Machine learning from a mathematical modelling view

(review, original): Proposing a modelling-theoretic, internal-state interpretation of machine learning.

An analysis of statistical physics in machine learning

(review, original): Setting up a review of how statistical physics relates to machine learning, where it is failing, and what constitutes the failure.

Classical model into neural network architecture – an analysis

(original, theoretical): An attempt to attack and formalize classical models in neural-network-theoretic terms (a unit-wise, neuronal approach), hence gauging their capability.

Unsolved problems in theoretical machine learning

(original, review): A review of the different interpretations of, and open problems residing in, the current framework of machine learning.

Differential equations and machine learning – interpreting machine learning system using differential equations

(original, theory): An attempt to express ML systems in terms of differential equations, in the spirit of differential models.
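
The best-known instance of this correspondence, standard in the neural-ODE literature, is that a residual block x ← x + h·f(x) is one forward-Euler step of the ODE dx/dt = f(x). A minimal sketch with a fixed, toy vector field (the weights and step sizes below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4)) * 0.1  # a toy, fixed-weight "layer"

def f(x):
    # The vector field defined by one layer: dx/dt = tanh(W x).
    return np.tanh(W @ x)

def residual_net(x, depth, h):
    # Each residual block is one forward-Euler step of the ODE.
    for _ in range(depth):
        x = x + h * f(x)
    return x

x0 = rng.standard_normal(4)
# Doubling the depth while halving the step size approximates
# the same continuous-time flow over total "time" depth * h = 1.
coarse = residual_net(x0, depth=50, h=0.02)
fine = residual_net(x0, depth=100, h=0.01)
print(np.linalg.norm(coarse - fine))
```

The two outputs agree closely, which is the depth-as-discretized-time reading that a differential-equation interpretation of ML systems would start from.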

Neural network learning equals mathematical model structural approximation

(original, theory, experiment).

Concentration inequalities in theoretical machine learning for beginners
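
The natural opening example for such a paper is Hoeffding's inequality, checked numerically; the sample sizes and threshold below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hoeffding: for n i.i.d. samples of a [0, 1]-bounded variable X,
#   P(|mean - E[X]| >= t) <= 2 * exp(-2 * n * t**2).
n, t, trials = 200, 0.1, 10_000
samples = rng.uniform(0.0, 1.0, size=(trials, n))  # E[X] = 0.5
deviations = np.abs(samples.mean(axis=1) - 0.5)
empirical = float(np.mean(deviations >= t))
bound = 2 * np.exp(-2 * n * t**2)
print(f"empirical tail: {empirical:.4f}, Hoeffding bound: {bound:.4f}")
```

The empirical tail probability sits well below the bound, which is the typical gap a beginner-oriented treatment would then explain.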

Structural addition in machine learning – an analysis

Component and structural realization of large language model and why they are not intelligent

Just to prove that the LLM is not, well, intelligent and, again, will never be AGI.

Hallucination is bias–variance tradeoff

We just, well, connect hallucination to the bias–variance tradeoff.

Visualization of neural network operation and layer theoretic

Time-sensitive processing network

Something that looks like an operational system in the inner units of the neural network.

Always running neural network

Mimicking a network that, well, always runs and is not static.

Visualizing dynamics and bias-variance plus double descent

Information theoretic interpretation of machine learning framework

Randomized neural network architecture on permutations

(theory, experiment, original): Just a way to do such an analysis.

MultiNet – A new multidirectional structure of neural network formalism

(original, experimentation): Just a generalization of what I have observed about neural network structure in general. It will have to build on existing facilities; nothing else is feasible.

Theoretical explanation for Neural Scaling Law

(original, theoretical): Just an attempt at a theoretical explanation of the neural scaling law.
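
Whatever the explanation turns out to be, the empirical object is a power law L(N) ≈ a·N^(−α). A sketch of recovering the exponent by a log-log least-squares fit, on synthetic loss measurements generated here from an assumed power law (the values of a and α are illustrative, not real data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical loss measurements L(N) at increasing model sizes N,
# synthesized from L = a * N**(-alpha) with small multiplicative noise.
N = np.array([1e6, 3e6, 1e7, 3e7, 1e8])
true_a, true_alpha = 50.0, 0.3
L = true_a * N ** (-true_alpha) * np.exp(0.01 * rng.standard_normal(N.size))

# In log space the power law is linear: log L = log a - alpha * log N.
slope, intercept = np.polyfit(np.log(N), np.log(L), 1)
alpha_hat = -slope
print(f"fitted exponent alpha ~ {alpha_hat:.3f}")
```

A theoretical explanation would have to say why real loss curves are so well described by this linear relation in log-log coordinates, and what sets the exponent.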

Reversibility or not of mechanistic-probabilistic systems

Now named the theory of the learnable (Valiant style, for real). It is mostly based on one of my blog posts, and I want to elaborate on that idea.

The test of modelling – component-based modelling and a toy model of analysis

(experimental): The first follow-up paper since the AGI paper.

The Kuhnian cycle and the development of artificial intelligence theory

(theoretical, philosophical, sociological, theory of science, original, review): A consideration and analysis identifying the Kuhnian cycles, the technical sophistry in question, and so on; something akin to a historical and field analysis in itself.

An analysis of the Navier–Stokes equations

(physics, original, review, reference, theoretical, mathematics, fluid dynamics): An analysis of the Navier–Stokes equation system, the fluid-dynamical system, and what transpires within it, plus an additional inquiry into how a solution might be reached under such a metric.

The illusion of correctness – how is quantum mechanics so correct?

(partially original, theoretical, review, analysis): On the correctness of quantum mechanics, other tangent theories, classical mechanical interpretations, and so on. Perhaps with a brief description of the applications to chemistry, quantum information and quantum computing, how everything simply “works itself out”, and the classical-limit reduction dilemma.

Quantum-like system and naturalness of the encoding language

(original, quantum physics, physics, theoretical): Offering an insight into how quantum-like systems give rise to apparently different formalisms, techniques, descriptions, and so on. This is particularly helpful, since many unresolved connotations exist about, for example, the complex numbers \(\mathbb{C}\).
