Abstract. Kramer's sampling theorem, a generalization of Shannon's sampling theorem, states that a function $f$ which is representable as a finite integral transform can be reconstructed from its sample values $f(t_k)$ by means of a series expansion with respect to a complete orthogonal set. The aim of this paper is to investigate the error that occurs when this expansion is applied to a function $f$ which is representable as an infinite rather than a finite integral transform. In particular, it is shown that in many applications this error tends to zero as the distance between the sampling points $t_k$ tends to zero.
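For orientation, the classical statement of Kramer's theorem alluded to above can be sketched as follows; this is the standard textbook formulation, not a quotation from the paper itself, and the symbols $I$, $K$, $g$, $S_k$ are notation introduced here for illustration. Suppose
\[
f(t) = \int_I K(x,t)\,g(x)\,dx, \qquad g \in L^2(I),
\]
where $I$ is a finite interval and, for a sequence of points $\{t_k\}$, the family $\{K(\cdot,t_k)\}$ forms a complete orthogonal set in $L^2(I)$. Then $f$ is recovered from its samples via
\[
f(t) = \sum_k f(t_k)\,S_k(t), \qquad
S_k(t) = \frac{\displaystyle\int_I K(x,t)\,\overline{K(x,t_k)}\,dx}{\displaystyle\int_I |K(x,t_k)|^2\,dx}.
\]
The present paper studies the error incurred when $I$ is replaced by an infinite interval, so that the hypotheses of this theorem no longer hold exactly.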
AMS Subject Classification (1991): 94A12, 41A25, 41A58, 34B24
Received October 27, 1994, and in revised form February 28, 1995. (Registered under 5619/2009.)