Probability and statistics for computer scientists / Michael Baron

By: Michael Baron
Material type: Text
Publication details: Boca Raton, Fla. : CRC, 2014
Edition: 2nd ed.
Description: xxiv, 449 p. : ill. ; 26 cm
ISBN:
  • 9781439875902
  • 1439875901
Subject(s):
DDC classification:
  • 519.2 BAR

Enhanced descriptions from Syndetics:

Student-Friendly Coverage of Probability, Statistical Methods, Simulation, and Modeling Tools
Incorporating feedback from instructors and researchers who used the previous edition, Probability and Statistics for Computer Scientists, Second Edition helps students understand general methods of stochastic modeling, simulation, and data analysis; make optimal decisions under uncertainty; model and evaluate computer systems and networks; and prepare for advanced probability-based courses. Written in a lively style with simple language, this classroom-tested book can now be used in both one- and two-semester courses.

New to the Second Edition

  • Axiomatic introduction of probability
  • Expanded coverage of statistical inference, including standard errors of estimates and their estimation, inference about variances, chi-square tests for independence and goodness of fit, nonparametric statistics, and bootstrap
  • More exercises at the end of each chapter
  • Additional MATLAB® codes, particularly new commands of the Statistics Toolbox

In-Depth yet Accessible Treatment of Computer Science-Related Topics
Starting with the fundamentals of probability, the text takes students through topics heavily featured in modern computer science, computer engineering, software engineering, and associated fields, such as computer simulations, Monte Carlo methods, stochastic processes, Markov chains, queuing theory, statistical inference, and regression. It also meets the requirements of the Accreditation Board for Engineering and Technology (ABET).

Encourages Practical Implementation of Skills
Using simple MATLAB commands (easily translatable to other computer languages), the book provides short programs for implementing the methods of probability and statistics as well as for visualizing randomness, the behavior of random variables and stochastic processes, convergence results, and Monte Carlo simulations. Preliminary knowledge of MATLAB is not required. Along with numerous computer science applications and worked examples, the text presents interesting facts and paradoxical statements. Each chapter concludes with a short summary and many exercises.
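
To give a flavor of the short programs described above, here is a minimal sketch (not taken from the book) of a Monte Carlo probability estimate in MATLAB; the Exponential distribution, the rate parameter, and the threshold are illustrative assumptions:

  % Monte Carlo estimate of P(X > 3) for X ~ Exponential(lambda = 0.5).
  % Distribution, rate, and threshold are illustrative assumptions.
  N = 100000;              % number of simulated values
  lambda = 0.5;            % assumed rate parameter
  U = rand(N, 1);          % Uniform(0,1) random numbers
  X = -log(U) / lambda;    % inverse transform method for the Exponential
  p_hat = mean(X > 3);     % fraction of samples exceeding the threshold
  fprintf('Estimate: %.4f (exact: %.4f)\n', p_hat, exp(-3 * lambda));

This simulate-then-average pattern corresponds to the topics of Sections 5.2.3 (inverse transform method) and 5.3.1 (estimating probabilities) in the table of contents below.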

Includes index.

Table of contents provided by Syndetics

  • List of Figures (p. xv)
  • List of Tables (p. xix)
  • Preface (p. xxi)
  • 1 Introduction and Overview (p. 1)
  • 1.1 Making decisions under uncertainty (p. 1)
  • 1.2 Overview of this book (p. 3)
  • Summary and conclusions (p. 5)
  • Exercises (p. 5)
  • I Probability and Random Variables (p. 7)
  • 2 Probability (p. 9)
  • 2.1 Events and their probabilities (p. 9)
  • 2.1.1 Outcomes, events, and the sample space (p. 10)
  • 2.1.2 Set operations (p. 11)
  • 2.2 Rules of Probability (p. 13)
  • 2.2.1 Axioms of Probability (p. 14)
  • 2.2.2 Computing probabilities of events (p. 15)
  • 2.2.3 Applications in reliability (p. 18)
  • 2.3 Combinatorics (p. 20)
  • 2.3.1 Equally likely outcomes (p. 20)
  • 2.3.2 Permutations and combinations (p. 22)
  • 2.4 Conditional probability and independence (p. 27)
  • Summary and conclusions (p. 32)
  • Exercises (p. 33)
  • 3 Discrete Random Variables and Their Distributions (p. 39)
  • 3.1 Distribution of a random variable (p. 40)
  • 3.1.1 Main concepts (p. 40)
  • 3.1.2 Types of random variables (p. 43)
  • 3.2 Distribution of a random vector (p. 44)
  • 3.2.1 Joint distribution and marginal distributions (p. 44)
  • 3.2.2 Independence of random variables (p. 45)
  • 3.3 Expectation and variance (p. 47)
  • 3.3.1 Expectation (p. 47)
  • 3.3.2 Expectation of a function (p. 48)
  • 3.3.3 Properties (p. 48)
  • 3.3.4 Variance and standard deviation (p. 49)
  • 3.3.5 Covariance and correlation (p. 51)
  • 3.3.6 Properties (p. 52)
  • 3.3.7 Chebyshev's inequality (p. 54)
  • 3.3.8 Application to finance (p. 55)
  • 3.4 Families of discrete distributions (p. 57)
  • 3.4.1 Bernoulli distribution (p. 57)
  • 3.4.2 Binomial distribution (p. 58)
  • 3.4.3 Geometric distribution (p. 60)
  • 3.4.4 Negative Binomial distribution (p. 63)
  • 3.4.5 Poisson distribution (p. 64)
  • 3.4.6 Poisson approximation of Binomial distribution (p. 66)
  • Summary and conclusions (p. 67)
  • Exercises (p. 68)
  • 4 Continuous Distributions (p. 75)
  • 4.1 Probability density (p. 75)
  • 4.2 Families of continuous distributions (p. 80)
  • 4.2.1 Uniform distribution (p. 80)
  • 4.2.2 Exponential distribution (p. 82)
  • 4.2.3 Gamma distribution (p. 84)
  • 4.2.4 Normal distribution (p. 89)
  • 4.3 Central Limit Theorem (p. 92)
  • Summary and conclusions (p. 95)
  • Exercises (p. 96)
  • 5 Computer Simulations and Monte Carlo Methods (p. 101)
  • 5.1 Introduction (p. 101)
  • 5.1.1 Applications and examples (p. 102)
  • 5.2 Simulation of random variables (p. 103)
  • 5.2.1 Random number generators (p. 104)
  • 5.2.2 Discrete methods (p. 105)
  • 5.2.3 Inverse transform method (p. 108)
  • 5.2.4 Rejection method (p. 110)
  • 5.2.5 Generation of random vectors (p. 113)
  • 5.2.6 Special methods (p. 113)
  • 5.3 Solving problems by Monte Carlo methods (p. 114)
  • 5.3.1 Estimating probabilities (p. 114)
  • 5.3.2 Estimating means and standard deviations (p. 118)
  • 5.3.3 Forecasting (p. 119)
  • 5.3.4 Estimating lengths, areas, and volumes (p. 120)
  • 5.3.5 Monte Carlo integration (p. 122)
  • Summary and conclusions (p. 124)
  • Exercises (p. 125)
  • II Stochastic Processes (p. 129)
  • 6 Stochastic Processes (p. 131)
  • 6.1 Definitions and classifications (p. 132)
  • 6.2 Markov processes and Markov chains (p. 133)
  • 6.2.1 Markov chains (p. 134)
  • 6.2.2 Matrix approach (p. 138)
  • 6.2.3 Steady-state distribution (p. 142)
  • 6.3 Counting processes (p. 148)
  • 6.3.1 Binomial process (p. 148)
  • 6.3.2 Poisson process (p. 152)
  • 6.4 Simulation of stochastic processes (p. 157)
  • Summary and conclusions (p. 160)
  • Exercises (p. 160)
  • 7 Queuing Systems (p. 167)
  • 7.1 Main components of a queuing system (p. 168)
  • 7.2 Little's Law (p. 170)
  • 7.3 Bernoulli single-server queuing process (p. 173)
  • 7.3.1 Systems with limited capacity (p. 176)
  • 7.4 M/M/1 system (p. 178)
  • 7.4.1 Evaluating the system's performance (p. 181)
  • 7.5 Multiserver queuing systems (p. 185)
  • 7.5.1 Bernoulli k-server queuing process (p. 186)
  • 7.5.2 M/M/k systems (p. 189)
  • 7.5.3 Unlimited number of servers and M/M/∞ (p. 192)
  • 7.6 Simulation of queuing systems (p. 193)
  • Summary and conclusions (p. 197)
  • Exercises (p. 198)
  • III Statistics (p. 205)
  • 8 Introduction to Statistics (p. 207)
  • 8.1 Population and sample, parameters and statistics (p. 208)
  • 8.2 Simple descriptive statistics (p. 211)
  • 8.2.1 Mean (p. 211)
  • 8.2.2 Median (p. 213)
  • 8.2.3 Quantiles, percentiles, and quartiles (p. 217)
  • 8.2.4 Variance and standard deviation (p. 219)
  • 8.2.5 Standard errors of estimates (p. 221)
  • 8.2.6 Interquartile range (p. 222)
  • 8.3 Graphical statistics (p. 223)
  • 8.3.1 Histogram (p. 224)
  • 8.3.2 Stem-and-leaf plot (p. 227)
  • 8.3.3 Boxplot (p. 229)
  • 8.3.4 Scatter plots and time plots (p. 231)
  • Summary and conclusions (p. 233)
  • Exercises (p. 233)
  • 9 Statistical Inference I (p. 237)
  • 9.1 Parameter estimation (p. 238)
  • 9.1.1 Method of moments (p. 239)
  • 9.1.2 Method of maximum likelihood (p. 242)
  • 9.1.3 Estimation of standard errors (p. 246)
  • 9.2 Confidence intervals (p. 247)
  • 9.2.1 Construction of confidence intervals: a general method (p. 248)
  • 9.2.2 Confidence interval for the population mean (p. 250)
  • 9.2.3 Confidence interval for the difference between two means (p. 251)
  • 9.2.4 Selection of a sample size (p. 253)
  • 9.2.5 Estimating means with a given precision (p. 254)
  • 9.3 Unknown standard deviation (p. 255)
  • 9.3.1 Large samples (p. 255)
  • 9.3.2 Confidence intervals for proportions (p. 256)
  • 9.3.3 Estimating proportions with a given precision (p. 258)
  • 9.3.4 Small samples: Student's t distribution (p. 259)
  • 9.3.5 Comparison of two populations with unknown variances (p. 261)
  • 9.4 Hypothesis testing (p. 264)
  • 9.4.1 Hypothesis and alternative (p. 265)
  • 9.4.2 Type I and Type II errors: level of significance (p. 266)
  • 9.4.3 Level α tests: general approach (p. 267)
  • 9.4.4 Rejection regions and power (p. 269)
  • 9.4.5 Standard Normal null distribution (Z-test) (p. 270)
  • 9.4.6 Z-tests for means and proportions (p. 272)
  • 9.4.7 Pooled sample proportion (p. 274)
  • 9.4.8 Unknown σ: T-tests (p. 275)
  • 9.4.9 Duality: two-sided tests and two-sided confidence intervals (p. 277)
  • 9.4.10 P-value (p. 280)
  • 9.5 Inference about variances (p. 285)
  • 9.5.1 Variance estimator and Chi-square distribution (p. 286)
  • 9.5.2 Confidence interval for the population variance (p. 287)
  • 9.5.3 Testing variance (p. 289)
  • 9.5.4 Comparison of two variances. F-distribution (p. 292)
  • 9.5.5 Confidence interval for the ratio of population variances (p. 294)
  • 9.5.6 F-tests comparing two variances (p. 296)
  • Summary and conclusions (p. 299)
  • Exercises (p. 300)
  • 10 Statistical Inference II (p. 305)
  • 10.1 Chi-square tests (p. 305)
  • 10.1.1 Testing a distribution (p. 306)
  • 10.1.2 Testing a family of distributions (p. 308)
  • 10.1.3 Testing independence (p. 310)
  • 10.2 Nonparametric statistics (p. 314)
  • 10.2.1 Sign test (p. 315)
  • 10.2.2 Wilcoxon signed rank test (p. 317)
  • 10.2.3 Mann-Whitney-Wilcoxon rank sum test (p. 322)
  • 10.3 Bootstrap (p. 328)
  • 10.3.1 Bootstrap distribution and all bootstrap samples (p. 328)
  • 10.3.2 Computer generated bootstrap samples (p. 333)
  • 10.3.3 Bootstrap confidence intervals (p. 335)
  • 10.4 Bayesian inference (p. 339)
  • 10.4.1 Prior and posterior (p. 340)
  • 10.4.2 Bayesian estimation (p. 345)
  • 10.4.3 Bayesian credible sets (p. 347)
  • 10.4.4 Bayesian hypothesis testing (p. 351)
  • Summary and conclusions (p. 352)
  • Exercises (p. 353)
  • 11 Regression (p. 361)
  • 11.1 Least squares estimation (p. 362)
  • 11.1.1 Examples (p. 362)
  • 11.1.2 Method of least squares (p. 364)
  • 11.1.3 Linear regression (p. 365)
  • 11.1.4 Regression and correlation (p. 367)
  • 11.1.5 Overfitting a model (p. 368)
  • 11.2 Analysis of variance, prediction, and further inference (p. 369)
  • 11.2.1 ANOVA and R-square (p. 369)
  • 11.2.2 Tests and confidence intervals (p. 371)
  • 11.2.3 Prediction (p. 377)
  • 11.3 Multivariate regression (p. 381)
  • 11.3.1 Introduction and examples (p. 381)
  • 11.3.2 Matrix approach and least squares estimation (p. 382)
  • 11.3.3 Analysis of variance, tests, and prediction (p. 384)
  • 11.4 Model building (p. 390)
  • 11.4.1 Adjusted R-square (p. 390)
  • 11.4.2 Extra sum of squares, partial F-tests, and variable selection (p. 391)
  • 11.4.3 Categorical predictors and dummy variables (p. 394)
  • Summary and conclusions (p. 397)
  • Exercises (p. 397)
  • IV Appendix (p. 403)
  • 12 Appendix (p. 405)
  • 12.1 Inventory of distributions (p. 405)
  • 12.1.1 Discrete families (p. 405)
  • 12.1.2 Continuous families (p. 407)
  • 12.2 Distribution tables (p. 411)
  • 12.3 Calculus review (p. 428)
  • 12.3.1 Inverse function (p. 428)
  • 12.3.2 Limits and continuity (p. 428)
  • 12.3.3 Sequences and series (p. 429)
  • 12.3.4 Derivatives, minimum, and maximum (p. 429)
  • 12.3.5 Integrals (p. 431)
  • 12.4 Matrices and linear systems (p. 434)
  • 12.5 Answers to selected exercises (p. 439)
  • Index (p. 445)

Author notes provided by Syndetics

Michael Baron is a professor of statistics at the University of Texas at Dallas. He has published two books and numerous research articles and book chapters. Dr. Baron is a fellow of the American Statistical Association, a member of the International Society for Bayesian Analysis, and an associate editor of the journal Sequential Analysis. In 2007, he was awarded the Abraham Wald Prize in Sequential Analysis. His research focuses on the use of sequential analysis, change-point detection, and Bayesian inference in epidemiology, clinical trials, cyber security, energy, finance, and semiconductor manufacturing. He received a Ph.D. in statistics from the University of Maryland.
