Paper Title
Q-NET: A Network for Low-Dimensional Integrals of Neural Proxies
Paper Authors
Paper Abstract
Many applications require the calculation of integrals of multidimensional functions. A general and popular procedure is to estimate integrals by averaging multiple evaluations of the function. Often, each evaluation of the function entails costly computations. The use of a \emph{proxy} or surrogate for the true function is useful if repeated evaluations are necessary. The proxy is even more useful if its integral is known analytically and can be calculated practically. We propose the use of a versatile yet simple class of artificial neural networks -- sigmoidal universal approximators -- as a proxy for functions whose integrals need to be estimated. We design a family of fixed networks, which we call Q-NETs, that operate on parameters of a trained proxy to calculate exact integrals over \emph{any subset of dimensions} of the input domain. We identify transformations to the input space for which integrals may be recalculated without resampling the integrand or retraining the proxy. We highlight the benefits of this scheme for a few applications such as inverse rendering, generation of procedural noise, visualization and simulation. The proposed proxy is appealing in the following contexts: the dimensionality is low ($<10$D); the estimation of integrals needs to be decoupled from the sampling strategy; sparse, adaptive sampling is used; marginal functions need to be known in functional form; or when powerful Single Instruction Multiple Data/Thread (SIMD/SIMT) pipelines are available for computation.
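As a rough illustration of why sigmoidal proxies admit analytic integrals -- and not the paper's Q-NET construction, which operates on the trained parameters to integrate over any subset of input dimensions -- the sketch below uses hypothetical parameters of a one-hidden-layer logistic-sigmoid network in 1D. Since the antiderivative of the logistic sigmoid is the softplus, the integral of such a proxy over [0, 1] has a closed form, which the sketch checks against a Monte Carlo average. All parameter names and values here are illustrative assumptions.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def softplus(t):
    # Numerically stable log(1 + exp(t)); the antiderivative of the logistic sigmoid.
    return np.logaddexp(0.0, t)

# Hypothetical parameters of a (pre-trained) one-hidden-layer sigmoidal proxy
#   f(x) = sum_i v[i] * sigmoid(w[i] * x + b[i]) + c   on the domain [0, 1].
rng = np.random.default_rng(0)
w = rng.uniform(0.5, 4.0, size=8) * rng.choice([-1.0, 1.0], size=8)  # kept away from 0
b = rng.normal(size=8)
v = rng.normal(size=8)
c = 0.3

def proxy(x):
    # Evaluate f at a batch of points x with shape (N,).
    return np.sum(v * sigmoid(np.outer(x, w) + b), axis=-1) + c

# Closed-form integral over [0, 1]:
#   int_0^1 sigmoid(w x + b) dx = (softplus(w + b) - softplus(b)) / w   (for w != 0)
analytic = np.sum(v * (softplus(w + b) - softplus(b)) / w) + c

# Monte Carlo estimate of the same integral, for comparison.
x = rng.random(200_000)
mc = proxy(x).mean()
print(f"analytic integral = {analytic:.6f}, Monte Carlo estimate = {mc:.6f}")
```

This only demonstrates the 1D building block; the paper's contribution is a fixed family of networks (Q-NETs) that compute such integrals, including marginals over arbitrary subsets of dimensions, directly from the parameters of the trained proxy.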