Search results for: non differentiable physics

Number of results: 1493706

2001
B. G. Sidharth

Much of twentieth century physics, whether it be Classical or Quantum, has been based on the concept of spacetime as a differentiable manifold. While this work has culminated in the standard model, it is now generally accepted that in the light of recent experimental results, we have to go beyond the standard model. On the other hand Quantum SuperString Theory and a recent model of Quantized Sp...

2008
Nadav Drukker

We show that some novel physics of supertubes removes closed time-like curves from many supersymmetric spaces which naively suffer from this problem. The main claim is that supertubes naturally form domain-walls, so while analytical continuation of the metric would lead to closed time-like curves, across the domain-wall the metric is non-differentiable, and the closed time-like curves are elimi...

We first establish an identity for $n$ times differentiable mappings. Then, a new inequality for $n$ times differentiable functions is deduced. Finally, some perturbed Ostrowski type inequalities for functions whose $n$th derivatives are of bounded variation are obtained.
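For context, the $n = 1$ case that such results generalize is the classical Ostrowski inequality, stated here in its standard form (not quoted from the paper above): for $f$ differentiable on $(a,b)$ with $|f'(t)| \le M$,

```latex
\left| f(x) - \frac{1}{b-a}\int_a^b f(t)\,dt \right|
  \le \left[ \frac{1}{4} + \frac{\left(x - \tfrac{a+b}{2}\right)^2}{(b-a)^2} \right] (b-a)\,M,
  \qquad x \in [a,b].
```

The perturbed variants referred to in the abstract weaken the smoothness assumption from a bounded derivative to an $n$th derivative of bounded variation.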

2000
Laurent Nottale

The theory of scale relativity extends Einstein's principle of relativity to scale transformations of resolutions. It is based on giving up the axiom of differentiability of the space-time continuum. Three consequences arise from this withdrawal. (i) The geometry of space-time becomes fractal, i.e., explicitly resolution-dependent: this allows one to describe a non-differentiable physics...

Journal: Journal of Statistical Mechanics: Theory and Experiment, 2012

Journal: Numerical Algorithms, 2022

Abstract: We consider a generic type of nonlinear Hammerstein-type integral equations whose particularity is a non-differentiable kernel of Nemytskii type. In order to solve them, we introduce a uniparametric family of derivative-free iterative processes, whose main advantage is that for a special value of the involved parameter the method obtained coincides with Newton's method; this is due to the fact that evaluating the divided difference operator whe...
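The paper's uniparametric family is only alluded to in the truncated abstract. As a generic sketch of the derivative-free idea it builds on (a divided difference standing in for the derivative), the classical secant method illustrates the principle; the test function below is an assumption for illustration, not taken from the paper:

```python
def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Derivative-free root finding: Newton's method with f'(x)
    replaced by the divided difference (f(x1) - f(x0)) / (x1 - x0)."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        denom = f1 - f0
        if denom == 0:  # flat divided difference: cannot proceed
            break
        x2 = x1 - f1 * (x1 - x0) / denom
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Usable even when f is non-differentiable, e.g. f(x) = x^3 + |x| - 1,
# which has a kink at 0 but a root in (0, 1).
root = secant(lambda x: x**3 + abs(x) - 1, 0.0, 1.0)
```

The appeal noted in the abstract is that, unlike Newton's method, no derivative of the (possibly non-differentiable) operator is ever evaluated.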

Journal: Neurocomputing, 2023

Gradient boosting is a prediction method that iteratively combines weak learners to produce a complex and accurate model. From an optimization point of view, the learning procedure of gradient boosting mimics gradient descent on a functional variable. This paper proposes to build upon the proximal point algorithm, when the empirical risk to minimize is not differentiable, in order to introduce a novel approach, called proximal boosting. It comes with comp...
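A minimal sketch of the idea described above, with heavy caveats: the absolute loss, the regression stumps, and the choice to fit each weak learner to a proximal-point displacement (rather than a subgradient) are all illustrative assumptions here, not the paper's exact algorithm:

```python
import numpy as np

def prox_abs(v, y, gamma):
    """Proximal operator of u -> |u - y| with step gamma:
    moves v toward y by gamma, stopping exactly at y (soft threshold)."""
    d = v - y
    return y + np.sign(d) * np.maximum(np.abs(d) - gamma, 0.0)

def fit_stump(x, target):
    """Best 1-D regression stump (one threshold, two constants) by SSE."""
    order = np.argsort(x)
    xs, ts = x[order], target[order]
    best = None
    for i in range(1, len(xs)):
        thr = 0.5 * (xs[i - 1] + xs[i])
        left, right = ts[:i].mean(), ts[i:].mean()
        sse = ((ts[:i] - left) ** 2).sum() + ((ts[i:] - right) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, thr, left, right)
    _, thr, left, right = best
    return lambda z: np.where(z < thr, left, right)

def proximal_boost(x, y, n_rounds=50, gamma=0.5):
    """Boosting for the non-differentiable absolute loss: each round fits
    the weak learner to the proximal displacement
    prox_{gamma*|y_i - .|}(F(x_i)) - F(x_i) instead of a gradient."""
    F = np.zeros_like(y)
    learners = []
    for _ in range(n_rounds):
        target = prox_abs(F, y, gamma) - F  # proximal step direction
        h = fit_stump(x, target)
        learners.append(h)
        F = F + h(x)
    return lambda z: sum(h(z) for h in learners)
```

The proximal step is well defined even where the loss has no gradient, which is exactly the regime the abstract targets.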
