Somabha Mukherjee

  • PhD Student

Contact Information

  • Office Address:

    451 Jon M. Huntsman Hall
    3730 Walnut Street
    Philadelphia, PA 19104

Research

  • Somabha Mukherjee and Bhaswar B. Bhattacharya (Work In Progress), Replica symmetry in upper tails of mean-field hypergraphs.

  • Debapratim Banerjee, Arun Kumar Kuchibhotla, Somabha Mukherjee (Work In Progress), Cramér-type large deviation and non-uniform central limit theorems in high dimensions.

    Abstract: Central limit theorems (CLTs) for high-dimensional random vectors, with the dimension possibly growing with the sample size, have received a lot of attention in recent times. Chernozhukov et al. (2017) proved a Berry--Esseen type result for high-dimensional averages over the class of hyperrectangles, showing that the rate of convergence can be upper bounded by n^{-1/6} up to a polynomial factor of log p (where n denotes the sample size and p the dimension). The classical literature on the central limit theorem contains various non-uniform extensions of the Berry--Esseen bound; similar extensions, however, have not appeared in the context of the high-dimensional CLT, and they are the main focus of our paper. Building on the classical large deviation and non-uniform CLT results for random variables in a Banach space due to Bentkus, Rackauskas, and Paulauskas, we prove three non-uniform variants of the high-dimensional CLT. In addition, we prove a dimension-free anti-concentration inequality for the absolute supremum of a Gaussian process on a compact metric space.
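
    For context, the rate referred to above is the hyperrectangle bound of Chernozhukov et al. (2017); a sketch of its standard form, in LaTeX notation (here B_n is the moment bound from that paper, Z is a Gaussian vector with matching covariance, and \mathcal{A}^{re} denotes the class of hyperrectangles in \mathbb{R}^p; this display is background from the cited reference, not a result of our paper):

        \sup_{A \in \mathcal{A}^{re}} \left| \mathbb{P}\Big( \frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i \in A \Big) - \mathbb{P}( Z \in A ) \right| \le C \Big( \frac{B_n^2 \log^7 (pn)}{n} \Big)^{1/6}.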

  • Bhaswar B. Bhattacharya, Somabha Mukherjee, Sumit Mukherjee (Working), Birthday paradox, monochromatic subgraphs, and the second moment phenomenon.

  • Arun Kumar Kuchibhotla, Somabha Mukherjee, Ayanendranath Basu (Draft), Statistical Inference based on Bridge Divergences.

    Description: M-estimators offer simple robust alternatives to the maximum likelihood estimator. Much of the robustness literature, however, has focused on the problems of location, location-scale, and regression estimation rather than on the estimation of general parameters. The density power divergence (DPD) and the logarithmic density power divergence (LDPD) measures provide two classes of competitive M-estimators (obtained from divergences) in general parametric models, each containing the MLE as a special case. In both families, the robustness of the estimator is achieved through a density power down-weighting of outlying observations, and both have proved to be very useful tools in robust inference. However, the relation and hierarchy between the minimum distance estimators of the two families are yet to be comprehensively studied or fully established. Given a particular set of real data, how does one choose an optimal member from the union of these two classes of divergences? In this paper, we present a generalized family of divergences incorporating the above two classes; this family provides a smooth bridge between the DPD and the LDPD measures. It helps to clarify and settle several longstanding issues in the relation between the DPD and LDPD families, apart from being an important tool in several areas of statistical inference in its own right.
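
    For orientation, the two endpoint families can be sketched in LaTeX notation as follows (these are the standard forms from the robustness literature, recalled here for a true density g and model density f_\theta with tuning parameter \alpha > 0; the exact bridge parametrization interpolating between them is given in the paper itself):

        \mathrm{DPD}_\alpha(g, f_\theta) = \int \Big\{ f_\theta^{1+\alpha} - \Big(1 + \frac{1}{\alpha}\Big) f_\theta^{\alpha} g + \frac{1}{\alpha} g^{1+\alpha} \Big\} \, dx,

        \mathrm{LDPD}_\alpha(g, f_\theta) = \log \int f_\theta^{1+\alpha} \, dx - \Big(1 + \frac{1}{\alpha}\Big) \log \int f_\theta^{\alpha} g \, dx + \frac{1}{\alpha} \log \int g^{1+\alpha} \, dx,

    so the LDPD applies a logarithm to each integral term of the DPD, and the bridge family interpolates smoothly between these two transforms.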

Teaching

Past Courses

  • STAT111 - INTRODUCTORY STATISTICS

    An introduction to concepts in probability, together with the basic statistical inference procedures of estimation, confidence intervals, and hypothesis testing, directed towards applications in science and medicine. The course makes use of the JMP statistical package.