Prof. Dr. Sonja Kuhnt

Associated member

Mathematical Statistics
Dortmund University of Applied Sciences and Arts

Publications
  • Metamodel-based optimization of shift planning in high-bay warehouse operations
    Kirchhoff, D. and Kirberg, M. and Kuhnt, S. and Clausen, U.
    Quality and Reliability Engineering International (2022)
    Gaussian process (GP) models of time-consuming computer simulations are nowadays widely used within metamodel-based optimization. In recent years, GP models with mixed inputs have been proposed to handle both numerical and categorical inputs. Using a case study of a high-bay warehouse, we demonstrate the use of GP models with low-rank correlation (LRC) kernels in the context of efficient global optimization (EGO). As is common in many logistics applications, the high-bay warehouse is modeled with a discrete-event simulation model. Input variables include, for example, the choice between different task assignment strategies. A shift scheduling problem is considered in which personnel and energy costs as well as the delay of tasks are to be minimized at the same time. Evaluations of an initial experimental design provide a first approximation of the Pareto front, which we manage to extend substantially within only 15 iterations of identifying new points using the expected hypervolume improvement (EHI) and the S metric selection (SMS) criteria. We penalize the criteria in the last five iterations using the known total costs of proposed points to guide the search towards a more desired area. The resulting Pareto front approximation provides a selection of shift plans that have different characteristics. This enables decision makers in practice to choose a shift plan with desirable features. © 2022 The Authors. Quality and Reliability Engineering International published by John Wiley & Sons Ltd.
    DOI: 10.1002/qre.3207
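
As background to the entry above: the expected hypervolume improvement (EHI) criterion can be sketched in a few dozen lines. The following is a minimal illustration, not the authors' implementation; their study uses GP models with low-rank correlation kernels for mixed (numerical and categorical) inputs, whereas this sketch uses plain RBF Gaussian processes on numerical inputs, a made-up bi-objective test function in place of the warehouse simulation, and a Monte Carlo estimate of EHI.

```python
# Minimal sketch of EHI-based point selection in a bi-objective EGO loop.
# Both objectives are minimized; EHI is estimated by Monte Carlo.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def pareto_front(Y):
    """Return the non-dominated rows of Y (both objectives minimized)."""
    keep = [i for i, y in enumerate(Y)
            if not any((Y <= y).all(1) & (Y < y).any(1))]
    return Y[keep]

def hypervolume_2d(front, ref):
    """Dominated hypervolume of a 2-D front w.r.t. a reference point."""
    front = front[np.argsort(front[:, 0])]
    hv, prev_y = 0.0, ref[1]
    for f1, f2 in front:
        hv += max(ref[0] - f1, 0.0) * max(prev_y - f2, 0.0)
        prev_y = min(prev_y, f2)
    return hv

def ehi(gps, x, front, ref, n_mc=80):
    """Monte Carlo estimate of the expected hypervolume improvement at x."""
    mu_s = [gp.predict(x[None, :], return_std=True) for gp in gps]
    hv0 = hypervolume_2d(front, ref)
    imp = 0.0
    for _ in range(n_mc):
        y = np.array([rng.normal(m[0], s[0]) for m, s in mu_s])
        imp += hypervolume_2d(pareto_front(np.vstack([front, y])), ref) - hv0
    return imp / n_mc

# toy bi-objective test problem standing in for the warehouse simulator
f = lambda X: np.c_[X[:, 0], 1.0 - np.sqrt(X[:, 0]) + X[:, 1] ** 2]

X = rng.uniform(size=(12, 2))              # initial design
Y = f(X)
ref = np.array([2.0, 2.0])                 # hypervolume reference point
for it in range(5):                        # a few EGO iterations
    gps = [GaussianProcessRegressor(RBF(0.3), alpha=1e-6).fit(X, Y[:, j])
           for j in range(2)]
    front = pareto_front(Y)
    cand = rng.uniform(size=(150, 2))      # random candidate points
    best = cand[np.argmax([ehi(gps, x, front, ref) for x in cand])]
    X, Y = np.vstack([X, best]), np.vstack([Y, f(best[None, :])])
print(pareto_front(Y))
```
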
  • Statistical Comparison of Processing Different Powder Feedstock in an HVOF Thermal Spray Process
    Tillmann, W. and Kuhnt, S. and Baumann, I.T. and Kalka, A. and Becker-Emden, E.-C. and Brinkhoff, A.
    Journal of Thermal Spray Technology 31 (2022)
    Cermet coatings such as WC-Co and Cr3C2-NiCr are frequently applied by means of thermal spray processes to protect highly stressed surfaces against wear. Investigations of the respective spray materials, their coating properties, and their in-flight particle properties are often carried out in separate experiments. In this study, the coating characteristics (hardness, deposition rate, porosity, thickness) and in-flight particle properties (particle velocity and temperature) of three different WC-based powders and a Cr3C2-NiCr powder processed by means of an HVOF process are investigated as a function of some key process parameters such as kerosene flow rate, lambda, spray distance and feeder disc velocity. These parameters were varied within a design of experiments, whilst all other parameters were fixed. Both the design-of-experiments plan and the settings of the fixed parameters were defined identically for all powders. The in-flight particle properties and coating characteristics are statistically modeled as a function of the process parameters, and their influences are compared. A well-selected, limited number of experimental runs using statistical design of experiments (DoE) enables this comparison. The deployed statistical models are generalized linear models with Gamma-distributed responses. The models show that particle velocity and particle temperature mainly depend on kerosene flow rate and spray distance. However, in the case of particle temperature, the model coefficients for Cr3C2-NiCr and WC powders have different signs, reflecting different qualitative behavior. © 2022, The Author(s).
    DOI: 10.1007/s11666-022-01392-2
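
The model class named in the abstract, a generalized linear model with Gamma-distributed response and log link, can be fitted with standard software. A minimal sketch with statsmodels on synthetic data follows; the variable names and the data-generating process are invented for illustration, not taken from the study.

```python
# Gamma GLM with log link: E[velocity] = exp(x'beta). Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 40
df = pd.DataFrame({
    "kerosene":   rng.uniform(20, 26, n),    # kerosene flow rate
    "lambda_":    rng.uniform(0.9, 1.2, n),  # oxygen/fuel ratio
    "spray_dist": rng.uniform(300, 400, n),  # spray distance
})
# synthetic positive response standing in for particle velocity
mu = np.exp(4.0 + 0.06 * df.kerosene - 0.002 * df.spray_dist)
df["velocity"] = rng.gamma(shape=50, scale=mu / 50)

model = smf.glm("velocity ~ kerosene + lambda_ + spray_dist", data=df,
                family=sm.families.Gamma(link=sm.families.links.Log()))
print(model.fit().summary())
```
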
  • Use of optimal mixture-process designs and response-surface models to study properties of calcium silicate units
    Kuhnt, S. and Becker-Emden, E.-C. and Schade, T. and Eden, W. and Middendorf, B.
    Quality and Reliability Engineering International 37 (2021)
    Calcium silicate units are versatile and widely used construction materials for edifices. Their production process involves several factors that concern either the mixture of the raw materials or the curing process. The understanding of how raw materials and process variables interact in achieving the compressive strength of the final product enables a cost- and energy-efficient layout of the production process. In this paper, we use mixture-process experiments to derive a prediction model for compressive strength. We compare computer-generated D-optimal designs with different numbers of center points by various criteria and by their prediction variance throughout the design space. In contrast to traditional mixture designs, these designs take additional constraints on the mixture components into account and can include process variables. We review suitable response-surface models, which combine mixture and process variables. Based on results from 72 experimental runs, a model for the mean compressive strength is built, combining expert knowledge with statistical model-selection strategies. The resulting model covers not only linear effects of mixture components and process variables but also interactions and quadratic terms. For example, the influence of the lime share on compressive strength differs among the use of various sand mixtures. For desired values of predicted compressive strength, factor settings can thereby be found reducing costs and energy emission. © 2020 John Wiley & Sons Ltd.
    DOI: 10.1002/qre.2758
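
Computer-generated D-optimal designs of the kind compared in this paper are typically found by exchange algorithms. The sketch below shows a simple Fedorov-type exchange maximizing det(X'X) for a quadratic response-surface model; it ignores the paper's mixture constraints and is only meant to illustrate the D-optimality criterion.

```python
# Fedorov-type exchange: pick n runs from a candidate set maximizing
# det(X'X) of the model matrix. Toy two-factor quadratic model; the
# mixture-process structure of the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(2)

def model_matrix(pts):
    """Quadratic response-surface model in two factors."""
    x1, x2 = pts[:, 0], pts[:, 1]
    return np.c_[np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2]

cand = rng.uniform(-1, 1, size=(300, 2))               # candidate points
idx = list(rng.choice(len(cand), 12, replace=False))   # start design, n=12

def log_det(idx):
    X = model_matrix(cand[np.array(idx)])
    return np.linalg.slogdet(X.T @ X)[1]

improved = True
while improved:                                # exchange until no gain
    improved = False
    for i in range(len(idx)):
        for j in range(len(cand)):
            if j in idx:
                continue
            trial = idx.copy()
            trial[i] = j
            if log_det(trial) > log_det(idx) + 1e-9:
                idx, improved = trial, True
print(cand[np.array(idx)])                     # the D-optimal runs
```
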
  • Detection of circlelike overlapping objects in thermal spray images
    Kirchhoff, D. and Kuhnt, S. and Bloch, L. and Müller, C.H.
    Quality and Reliability Engineering International 36 (2020)
    In this paper, we present a new algorithm for the detection of distorted and overlapping circlelike objects in noisy grayscale images. Its main step is an edge detection using rotated difference kernel estimators. To the resulting estimated edge points, circles are fitted in an iterative manner using a circular clustering algorithm. A new measure of similarity can assess the performance of algorithms for the detection of circlelike objects, even if the number of detected circles does not coincide with the number of true circles. We apply the algorithm to scanning electron microscope images of a high-velocity oxygen fuel (HVOF) spray process, which is a popular coating technique. There, a metal powder is fed into a jet, gets accelerated and heated up by means of a mixture of oxygen and fuel, and finally deposits as a coating on a substrate. If the process is stopped before a continuous layer is formed, the molten metal powder solidifies in the form of small, almost circular so-called splats, which vary with regard to their shape, size, and structure and can overlap each other. As these properties are challenging for existing image processing algorithms, engineers have up to now analyzed splat images manually. We further compare our new algorithm with a baseline approach that uses the Laplacian of Gaussian blob detection. It turns out that our algorithm performs better on a set of test images of round, spattered, and overlapping circles. © 2020 The Authors. Quality and Reliability Engineering International published by John Wiley & Sons Ltd.
    DOI: 10.1002/qre.2689
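
The baseline mentioned in the abstract, Laplacian of Gaussian (LoG) blob detection, is available in scikit-image and can be tried directly; the snippet below applies it to a synthetic image of overlapping discs. This is the comparison method, not the authors' rotated-difference-kernel algorithm.

```python
# LoG blob detection on a synthetic grayscale image of overlapping "splats".
import numpy as np
from skimage.draw import disk
from skimage.feature import blob_log

img = np.zeros((200, 200))
for center, r in [((60, 60), 20), ((75, 75), 15), ((150, 120), 25)]:
    rr, cc = disk(center, r, shape=img.shape)   # overlapping discs
    img[rr, cc] = 1.0
img += 0.05 * np.random.default_rng(3).normal(size=img.shape)  # noise

# each row: (row, col, sigma); circle radius is roughly sqrt(2) * sigma
blobs = blob_log(img, min_sigma=8, max_sigma=30, num_sigma=10, threshold=0.1)
blobs[:, 2] *= np.sqrt(2)
print(blobs)
```
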
  • Support indices: Measuring the effect of input variables over their supports
    Fruth, J. and Roustant, O. and Kuhnt, S.
    Reliability Engineering and System Safety 187 (2019)
    Two new sensitivity indices are presented which give an original solution to the question in sensitivity analysis of how to determine regions within the input space for which the model variation is high. The indices, as functions over the input domain, give insight into the local influence of input variables over the whole domain when the other variables lie in the global domain. They can serve as an informative extension to a standard analysis and in addition are especially helpful in the specification of the input domain, a critical, but often vaguely handled issue in sensitivity analysis. In the usual framework of independent continuous input variables, we present theoretical results that show an asymptotic connection between the presented indices and Sobol’ indices, valid for general probability distribution functions. Finally, we show how the indices can be successfully applied on analytical examples and on a real application. © 2018
    DOI: 10.1016/j.ress.2018.07.026
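
For orientation, the Sobol' indices that the support indices connect to asymptotically are usually estimated by pick-and-freeze Monte Carlo. A standard first-order estimator, sketched on a toy function (background material, not the paper's new indices):

```python
# Pick-and-freeze estimate of the first-order Sobol' index
# S_i = Var(E[Y|X_i]) / Var(Y), with independent uniform inputs.
import numpy as np

rng = np.random.default_rng(4)

def sobol_first_order(f, d, i, n=100_000):
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    B_i = B.copy()
    B_i[:, i] = A[:, i]            # "freeze" coordinate i from A
    yA, yB, yBi = f(A), f(B), f(B_i)
    var = np.var(np.r_[yA, yB])
    return np.mean(yA * (yBi - yB)) / var   # Saltelli-style estimator

# Ishigami-type toy function on [0,1]^3, rescaled to [-pi, pi]
def f(X):
    Z = np.pi * (2 * X - 1)
    return (np.sin(Z[:, 0]) + 7 * np.sin(Z[:, 1]) ** 2
            + 0.1 * Z[:, 2] ** 4 * np.sin(Z[:, 0]))

for i in range(3):
    print(f"S_{i+1} ~ {sobol_first_order(f, 3, i):.3f}")
```
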
  • Optimal designs for thermal spraying
    Dette, H. and Hoyden, L. and Kuhnt, S. and Schorning, K.
    Journal of the Royal Statistical Society. Series C: Applied Statistics 66 (2017)
    We consider the problem of designing additional experiments to update statistical models for latent day-specific effects. The problem appears in thermal spraying, where particles are sprayed on surfaces to obtain a coating. The relationships between in-flight properties of the particles and the controllable variables are modelled by generalized linear models. However, there are also non-controllable variables, which may vary from day to day and are modelled by day-specific additive effects. Existing generalized linear models for properties of the particles in flight must be updated from a limited number of additional experiments on a different day. We develop robust D-optimal designs to collect additional data for an update of the day effects, which are efficient for the estimation of the parameters in all models under consideration. The results are applied to the thermal spraying process, and a comparison of the statistical analysis based on a reference design as well as on a selected Bayesian D-optimal design is performed. © 2016 Royal Statistical Society
    DOI: 10.1111/rssc.12156
  • An angle-based multivariate functional pseudo-depth for shape outlier detection
    Kuhnt, S. and Rehage, A.
    Journal of Multivariate Analysis 146 (2016)
    A measure especially designed for detecting shape outliers in functional data is presented. It is based on the tangential angles of the intersections of the centred data and can be interpreted like a data depth. Due to its theoretical properties we call it functional tangential angle (FUNTA) pseudo-depth. Furthermore we introduce a robustification (rFUNTA). The existence of intersection angles is ensured through the centring. Assuming that shape outliers in functional data follow a different pattern, the distribution of intersection angles differs. Furthermore we formulate a population version of FUNTA in the context of Gaussian processes. We determine sample breakdown points of FUNTA and compare its performance with respect to outlier detection in simulation studies and a real data example. © 2015 Elsevier Inc.
    DOI: 10.1016/j.jmva.2015.10.016
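
A toy rendering of the FUNTA idea, assuming curves observed on a common grid: centre each curve, locate its crossings with the other curves, and summarize the tangential angles there. The exact definition in the paper differs; this sketch only conveys why a shape outlier tends to cross the bulk of the data at atypical angles.

```python
# Simplified intersection-angle pseudo-depth, loosely in the spirit of
# FUNTA. Our own toy version, not the authors' definition.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 101)
curves = np.array([np.sin(2 * np.pi * (t + s))
                   for s in rng.normal(0, .05, 20)])
curves[0] = np.sin(6 * np.pi * t)          # a shape outlier

curves = curves - curves.mean(axis=1, keepdims=True)   # centre each curve
dt = t[1] - t[0]

def funta_like(i):
    angles = []
    for j in range(len(curves)):
        if j == i:
            continue
        d = curves[i] - curves[j]
        for k in np.where(np.sign(d[:-1]) * np.sign(d[1:]) < 0)[0]:
            s1 = (curves[i][k + 1] - curves[i][k]) / dt   # slopes at the
            s2 = (curves[j][k + 1] - curves[j][k]) / dt   # crossing point
            angles.append(abs(np.arctan(s1) - np.arctan(s2)))
    # deep curves cross the others at shallow angles
    return 1 - np.mean(angles) / np.pi if angles else 0.0

depths = [funta_like(i) for i in range(len(curves))]
print(np.argmin(depths))   # the shape outlier should have the lowest depth
```
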
  • Numerical algebraic fan of a design for statistical model building
    Rudak, N. and Kuhnt, S. and Riccomagno, E.
    Statistica Sinica 26 (2016)
    An important issue in the design of experiments is the question of identifiability of models. This paper deals with a modeling process where linear modeling goes beyond the simple relationship between input and output variables. Observations or predictions from the chosen experimental design are themselves input variables for an eventual output. Tools developed to analyze designs from algebraic statistics are extended to noisy, irregular designs. They enable an advanced study of model identifiability. Model building is opened up to higher-order interactions rather than being restricted to main effects or two-way interactions only. The new approach is compared to classical model building strategies in an application to a thermal spraying process.
    DOI: 10.5705/ss.2014.264
  • Residual Analysis in Generalized Function-on-Scalar Regression for an HVOF Spraying Process
    Kuhnt, S. and Rehage, A. and Becker-Emden, C. and Tillmann, W. and Hussong, B.
    Quality and Reliability Engineering International 32 (2016)
    The coating of materials plays an important role in various fields of engineering. Essential properties such as wear protection can be improved by a suitable coating technique. One of these techniques is high-velocity oxygen-fuel spraying. A drawback of high-velocity oxygen-fuel spraying is that it lacks reproducibility due to effects which are hard to measure directly. However, coating powder particles are observable over time during their flight towards the material and contain valuable information about the state of the process. Because of their smooth nature, measures of temperature and velocity can be assumed as target variables in generalized function-on-scalar regression. We propose methods to perform residual analysis in this framework, aiming at the detection of individual residual functions which deviate from the majority of residuals. These methods help to detect anomalies in the process and hence improve the estimators. Functional target variables result in functional residuals whose analysis is barely explored. One reason might be that ordinary residual plots would have to be inspected at each observed point in time. We circumvent this infeasible procedure by the use of functional depths that help to identify unusual residuals and thereby gain deeper insight into the data-generating process. In a simulation study, we find that a good depth for detecting trend outliers is the h-modal depth, as long as the link function is chosen correctly. In the case of shape outliers, the rFUNTA pseudo-depth performs well. Copyright © 2016 John Wiley & Sons, Ltd.
    DOI: 10.1002/qre.2018
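
The h-modal depth favoured in the paper's simulation study for trend outliers has a simple kernel form. A minimal version for residual curves on a common grid (the bandwidth heuristic below is our own choice):

```python
# Kernel-based modal depth: sum of Gaussian kernels of L2-type distances
# between curves. Low depth flags a deviating residual curve.
import numpy as np

rng = np.random.default_rng(6)
t = np.linspace(0, 1, 101)
resid = rng.normal(0, .1, size=(30, t.size))
resid[0] += 0.8 * t                      # a trend-outlying residual curve

def h_modal_depth(curves, h=None):
    d = np.array([[np.sqrt(((ci - cj) ** 2).mean()) for cj in curves]
                  for ci in curves])     # pairwise L2-type distances
    if h is None:                        # a simple bandwidth heuristic
        h = np.quantile(d[d > 0], 0.15)
    return np.exp(-(d / h) ** 2).sum(axis=1)

depths = h_modal_depth(resid)
print(np.argmin(depths))                 # 0: the outlying residual curve
```
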
  • Sequential designs for sensitivity analysis of functional inputs in computer experiments
    Fruth, J. and Roustant, O. and Kuhnt, S.
    Reliability Engineering and System Safety 134 (2015)
    Computer experiments are nowadays commonly used to analyze industrial processes aiming at achieving a wanted outcome. Sensitivity analysis plays an important role in exploring the actual impact of adjustable parameters on the response variable. In this work we focus on sensitivity analysis of a scalar-valued output of a time-consuming computer code depending on scalar and functional input parameters. We investigate a sequential methodology, based on piecewise constant functions and sequential bifurcation, which is both economical and fully interpretable. The new approach is applied to a sheet metal forming problem in three sequential steps, resulting in new insights into the behavior of the forming process over time. © 2014 Elsevier Ltd. All rights reserved.
    DOI: 10.1016/j.ress.2014.07.018
  • Simultaneous Optimization of Multiple Correlated Responses with Application to a Thermal Spraying Process
    Rudak, N. and Hussong, B. and Kuhnt, S.
    Quality and Reliability Engineering International 31 (2015)
    In industrial applications, it is often desired to find settings of process parameters which lead to pre-specified target values of multiple quality characteristics with minimal variance. One approach to solve this problem is to minimize an estimated risk function depending on a cost matrix. The joint optimization (JOP) method follows this general strategy using a sequence of diagonal cost matrices and requires estimated models for the expectation and the variance of the responses. However, if the quality characteristics are correlated, this should be taken into account at the model or optimization stage in order to find a realistic solution. In this contribution, we extend the JOP method to the simultaneous optimization of correlated multiple responses. We also introduce a new approach for the choice of non-diagonal cost matrices. The resulting JOP method for correlated responses is illustrated on an application arising in the field of thermal spraying. Copyright © 2015 John Wiley & Sons, Ltd.
    DOI: 10.1002/qre.1837
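
The risk function mentioned in the abstract is a quadratic loss with a cost matrix; in notation of our own choosing, its conditional mean splits into a bias and a variance part, which is the trade-off the JOP method navigates:

```latex
% Quadratic risk for responses Y with target vector \tau and cost matrix C
% (notation ours, a sketch of the general idea):
\[
  R(x) \;=\; \mathrm{E}\!\left[(Y-\tau)^{\top} C \,(Y-\tau)\,\middle|\,x\right]
        \;=\; \bigl(\mu(x)-\tau\bigr)^{\top} C \bigl(\mu(x)-\tau\bigr)
        \;+\; \operatorname{tr}\!\bigl(C\,\Sigma(x)\bigr),
\]
% where \mu(x) and \Sigma(x) are the (estimated) mean vector and covariance
% matrix of the responses at parameter settings x. For correlated responses
% the off-diagonal entries of \Sigma(x) matter, which motivates the
% non-diagonal cost matrices introduced in the paper.
```
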
  • A Parallel Optimization Algorithm based on FANOVA Decomposition
    Ivanov, M. and Kuhnt, S.
    Quality and Reliability Engineering International 30 (2014)
    The analysis and modeling of complex industrial processes, like the forming of car parts, is often performed with the help of computer simulations. Optimization of such computer experiments usually relies on metamodel-based sequential strategies. The existing sequential algorithms, however, share the limitation that they only allow a single simulation at a time. In this article, we present a very elegant way to produce a parallel optimization procedure, based on a technique from the sensitivity analysis toolbox: the functional analysis of variance (FANOVA) graph. The proposed novel simultaneous optimization scheme is called the ParOF algorithm. It is compared with a very effective black-box procedure, the well-known efficient global optimization (EGO) algorithm, on analytical test cases and in an optimization study of a sheet forming simulation. Besides demonstrating the advantages of our parallel optimization method, the results show that it can successfully be applied to sheet metal forming for the purpose of quality improvement of the finished parts. © 2014 John Wiley & Sons, Ltd.
    DOI: 10.1002/qre.1710
  • Crossed-derivative based sensitivity measures for interaction screening
    Roustant, O. and Fruth, J. and Iooss, B. and Kuhnt, S.
    Mathematics and Computers in Simulation 105 (2014)
    Global sensitivity analysis is used to quantify the influence of input variables on a numerical model output. Sobol' indices are now classical sensitivity measures. However, their estimation requires a large number of model evaluations, especially when interaction effects are of interest. Derivative-based global sensitivity measures (DGSM) have recently shown their efficiency for the identification of non-influential inputs. In this paper, we define crossed DGSM, based on second-order derivatives of the model output. By using an L2-Poincaré inequality, we provide a crossed-DGSM based maximal bound for the superset importance (i.e., the total Sobol' index of an interaction between two inputs). In order to apply this result, we discuss how to estimate the Poincaré constant for various probability distributions. Several analytical and numerical tests show the performance of the bound and allow us to develop a generic strategy for interaction screening. © 2014 IMACS. Published by Elsevier B.V. All rights reserved.
    DOI: 10.1016/j.matcom.2014.05.005
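
The bound described in the abstract has the following shape (notation ours, following the standard DGSM literature):

```latex
% Superset importance of the pair (i,j), bounded via the crossed second
% derivative and the Poincaré constants C(\mu_i), C(\mu_j) of the input
% distributions (a sketch in our notation):
\[
  D_{ij}^{\mathrm{super}}
  \;=\; \sum_{I \supseteq \{i,j\}} D_I
  \;\le\; C(\mu_i)\, C(\mu_j)\,
          \mathrm{E}\!\left[\left(
            \frac{\partial^2 f}{\partial x_i\, \partial x_j}(X)
          \right)^{\!2}\right].
\]
% A small crossed DGSM thus certifies that no interaction involving both
% x_i and x_j is active. For x_i uniform on [0,1] the Poincaré constant
% is 1/\pi^2.
```
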
  • On- and offline detection of structural breaks in thermal spraying processes
    Borowski, M. and Rudak, N. and Hussong, B. and Wied, D. and Kuhnt, S. and Tillmann, W.
    Journal of Applied Statistics 41 (2014)
    We investigate and develop methods for structural break detection, considering time series from thermal spraying process monitoring. Since engineers induce technical malfunctions during the processes, the time series exhibit structural breaks at known time points, giving us valuable information to conduct the investigations. First, we consider a recently developed robust online (also real-time) filtering (i.e. smoothing) procedure that comprises a test for local linearity. This test rejects when jumps and trend changes are present, so that it can also be useful to detect such structural breaks online. Second, based on the filtering procedure, we develop a robust method for the online detection of ongoing trends. We investigate these two methods as to the online detection of structural breaks by simulations and applications to the time series from the manipulated spraying processes. Third, we consider a recently developed fluctuation test for constant variances that can be applied offline, i.e. after the whole time series has been observed, to control the spraying results. Since this test is not reliable when jumps are present in the time series, we suggest a data transformation based on filtering and demonstrate that this transformation makes the test applicable. © 2013 Taylor & Francis.
    DOI: 10.1080/02664763.2013.860957
  • Outlier detection in contingency tables based on minimal patterns
    Kuhnt, S. and Rapallo, F. and Rehage, A.
    Statistics and Computing 24 (2014)
    A new technique for the detection of outliers in contingency tables is introduced, where outliers are unusual cell counts with respect to classical loglinear Poisson models. Subsets of cell counts called minimal patterns are defined, corresponding to non-singular design matrices and leading to potentially uncontaminated maximum-likelihood estimates of the model parameters and thereby the expected cell counts. A criterion to easily produce minimal patterns in the two-way case under independence is derived, based on the analysis of the positions of the chosen cells. A simulation study and a couple of real-data examples are presented to illustrate the performance of the newly developed outlier identification algorithm, and to compare it with other existing methods. © 2013 Springer Science+Business Media New York.
    DOI: 10.1007/s11222-013-9382-8
  • Statistical Hypothesis Testing with SAS and R
    Taeger, D. and Kuhnt, S.
    Statistical Hypothesis Testing with SAS and R (2014)
    A comprehensive guide to statistical hypothesis testing with examples in SAS and R. When analyzing datasets the following questions often arise: Is there a shorthand procedure for a statistical test available in SAS or R? If so, how do I use it? If not, how do I program the test myself? This book answers these questions and provides an overview of the most common statistical test problems in a comprehensive way, making it easy to find and perform an appropriate statistical test. A general summary of statistical test theory is presented, along with a basic description for each test, including the necessary prerequisites, assumptions, the formal test problem and the test statistic. Examples in both SAS and R are provided, along with program code to perform the test, resulting output and remarks explaining the necessary program parameters. Key features: Provides examples in both SAS and R for each test presented. Looks at the most common statistical tests, presented in a clear and easy-to-follow way. Supported by a supplementary website http://www.d-taeger.de featuring example program code. Academics, practitioners and SAS and R programmers will find this book a valuable resource. Students using SAS and R will also find it an excellent choice for reference and data analysis. © 2014 John Wiley & Sons, Ltd. All rights reserved.
    DOI: 10.1002/9781118762585
  • The ENBIS-13 Quality and Reliability Engineering International special issue
    Kuhnt, S. and Zempléni, A.
    Quality and Reliability Engineering International 30 (2014)
    DOI: 10.1002/qre.1721
  • Total interaction index: A variance-based sensitivity index for second-order interaction screening
    Fruth, J. and Roustant, O. and Kuhnt, S.
    Journal of Statistical Planning and Inference 147 (2014)
    Sensitivity analysis aims at highlighting the input variables that have a significant impact on a given model response of interest. By analogy with the total sensitivity index, used to detect the most influential variables, a screening of interactions can be done efficiently with the so-called total interaction index (TII), defined as the superset importance of a pair of variables. Our aim is to investigate the TII, with a focus on statistical inference. At the theoretical level, we derive its connection to total and closed sensitivity indices. We present several estimation methods and prove the asymptotic efficiency of the Liu and Owen estimator. We also address the question of estimating the full set of TIIs, with a given budget of function evaluations. We observe that with the pick-and-freeze method the full set of TIIs can be estimated at a linear cost with respect to the problem dimension. The different estimators are then compared empirically. Finally, an application is given that aims at discovering the block-additive structure of a function when no prior knowledge is available, neither about the interaction structure nor about the blocks. © 2013 Elsevier B.V.
    DOI: 10.1016/j.jspi.2013.11.007
  • Correspondence analysis in the case of outliers
    Langovaya, A. and Kuhnt, S. and Chouikha, H.
    Studies in Classification, Data Analysis, and Knowledge Organization (2013)
    Analysis of categorical data by means of Correspondence Analysis (CA) has recently become popular. The behavior of CA in the presence of outliers in the table is not sufficiently explored in the literature, especially in the case of multidimensional contingency tables. In our research we apply correspondence analysis to three-way contingency tables with outliers, generated by deviations from the independence model. Outliers in our work are chosen in such a way that they break the independence in the table, but still they are not large enough to be easily spotted without statistical analysis. We study the change in the correspondence analysis row and column coordinates caused by the outliers and perform numerical analysis of the outlier coordinates. © Springer-Verlag Berlin Heidelberg 2013.
    DOI: 10.1007/978-3-642-28894-4_8
  • Groups acting on Gaussian graphical models
    Draisma, J. and Kuhnt, S. and Zwiernik, P.
    Annals of Statistics 41 (2013)
    Gaussian graphical models have become a well-recognized tool for the analysis of conditional independencies within a set of continuous random variables. From an inferential point of view, it is important to realize that they are composite exponential transformation families. We reveal this structure by explicitly describing, for any undirected graph, the (maximal) matrix group acting on the space of concentration matrices in the model. The continuous part of this group is captured by a poset naturally associated to the graph, while automorphisms of the graph account for the discrete part of the group. We compute the dimension of the space of orbits of this group on concentration matrices, in terms of the combinatorics of the graph; and for dimension zero we recover the characterization by Letac and Massam of models that are transformation families. Furthermore, we describe the maximal invariant of this group on the sample space, and we give a sharp lower bound on the sample size needed for the existence of equivariant estimators of the concentration matrix. Finally, we address the issue of robustness of these estimators by computing upper bounds on finite sample breakdown points. © Institute of Mathematical Statistics, 2013.
    DOI: 10.1214/13-AOS1130
  • Influence of parameter variations on WC-Co splat formation in an HVOF process using a new beam-shutter device
    Tillmann, W. and Hussong, B. and Priggemeier, T. and Kuhnt, S. and Rudak, N. and Weinert, H.
    Journal of Thermal Spray Technology 22 (2013)
    The formation of single splats is the foundation for any thermal spray coating. Therefore, this study focuses on the investigation of single splat morphologies to determine the influence of spray parameters on the morphological distribution of particles inside the flame. A new method to create a footprint of a spray jet with an extremely short exposure time was used. The resulting field of splats enabled the assignment of each splat to its radial position in the spray jet. The footprints were analyzed and the quantities and morphologies of the splats were correlated to particle in-flight measurements and coating properties. A strong correlation between the particle velocity, the percentage of the so-called pancake-like splats, and the porosity of the coating could be revealed. The influence of the particle temperature was found to be of minor importance to the splat form and the porosity of the coatings. Still, the particle temperature had a good correlation with the coating hardness and the dissolving of the WC. Measurements of the splat size in different areas of the footprints revealed that the percentage of splats larger than 40 μm in diameter was generally higher in the center of the footprint than in the outer regions. © 2013 ASM International.
    DOI: 10.1007/s11666-012-9881-8
  • Robustness and complex data structures: Festschrift in honour of Ursula Gather
    Becker, C. and Fried, R. and Kuhnt, S.
    Robustness and Complex Data Structures: Festschrift in Honour of Ursula Gather (2013)
    This Festschrift in honour of Ursula Gather’s 60th birthday deals with modern topics in the field of robust statistical methods, especially for time series and regression analysis, and with statistical methods for complex data structures. The individual contributions of leading experts provide a textbook-style overview of the topic, supplemented by current research results and questions. The statistical theory and methods in this volume aim at the analysis of data which deviate from classical stringent model assumptions, which contain outlying values and/or have a complex structure. Written for researchers as well as master and PhD students with a good knowledge of statistics. © Springer-Verlag Berlin Heidelberg 2013.
    DOI: 10.1007/978-3-642-35494-6
  • The concept of α-outliers in structured data situations
    Kuhnt, S. and Rehage, A.
    Robustness and Complex Data Structures: Festschrift in Honour of Ursula Gather (2013)
    Ever since the first data sets were collected and analyzed by specialists and scientists, the question of which observations are “normal” and which are not has been asked. There is a considerable amount of uncertainty and opacity in data analyses where authors claim that certain observations do not fit to the rest of the data and have therefore been removed or analyzed more accurately. However, no unique definition of the term “outlier” exists. Numerous proposals for this issue have been made. In this chapter we discuss the model-based concept of α-outliers, which is based on the density of the assumed probability distribution. © Springer-Verlag Berlin Heidelberg 2013.
    DOI: 10.1007/978-3-642-35494-6_6
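
The α-outlier concept can be stated compactly (a sketch in standard notation): observations are flagged relative to a low-density region of the assumed model distribution P with density f:

```latex
% \alpha-outlier region of a model distribution P with density f:
\[
  \mathrm{out}(\alpha, P) \;=\; \bigl\{\, x : f(x) < K(\alpha) \,\bigr\},
  \qquad
  K(\alpha) \;=\; \sup\bigl\{\, K : P\bigl(f(X) < K\bigr) \le \alpha \,\bigr\}.
\]
% Every observation falling into out(\alpha, P) is called an \alpha-outlier;
% the region carries at most probability \alpha under the model.
```
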
  • Data-driven Kriging models based on FANOVA-decomposition
    Muehlenstaedt, T. and Roustant, O. and Carraro, L. and Kuhnt, S.
    Statistics and Computing 22 (2012)
    Kriging models have been widely used in computer experiments for the analysis of time-consuming computer codes. Based on kernels, they are flexible and can be tuned to many situations. In this paper, we construct kernels that reproduce the computer code complexity by mimicking its interaction structure. While the standard tensor-product kernel implicitly assumes that all interactions are active, the new kernels are suited for a general interaction structure, and will take advantage of the absence of interaction between some inputs. The methodology is twofold. First, the interaction structure is estimated from the data, using a first initial standard Kriging model, and represented by a so-called FANOVA graph. New FANOVA-based sensitivity indices are introduced to detect active interactions. Then this graph is used to derive the form of the kernel, and the corresponding Kriging model is estimated by maximum likelihood. The performance of the overall procedure is illustrated by several 3-dimensional and 6-dimensional simulated and real examples. A substantial improvement is observed when the computer code has a relatively high level of complexity. © 2011 Springer Science+Business Media, LLC.
    DOI: 10.1007/s11222-011-9259-7
  • How to choose the simulation model for computer experiments: A local approach
    Mühlenstädt, T. and Gösling, M. and Kuhnt, S.
    Applied Stochastic Models in Business and Industry 28 (2012)
    In many scientific areas, non-stochastic simulation models such as finite element simulations replace real experiments. A common approach is to fit a meta-model, for example a Gaussian process model, a radial basis function interpolation, or a kernel interpolation, to computer experiments conducted with the simulation model. This article deals with situations where more than one simulation model is available for the same real experiment, with none being the best over all possible input combinations. From fitted models for a real experiment as well as for computer experiments using the different simulation models, a criterion is derived to identify the locally best one. Applying this criterion to a number of design points allows the design space to be split into areas where the individual simulation models are locally superior. An example from sheet metal forming is analyzed, where three different simulation models are available. In this application and many similar problems, the new approach provides valuable assistance with the choice of the simulation model to be used. © 2011 John Wiley & Sons, Ltd.
    DOI: 10.1002/asmb.909
  • Asymmetry models for square contingency tables: Exact tests via algebraic statistics
    Krampe, A. and Kateri, M. and Kuhnt, S.
    Statistics and Computing 21 (2011)
    Square contingency tables with the same row and column classification occur frequently in a wide range of statistical applications, e.g., whenever the members of a matched pair are classified on the same scale, which is usually ordinal. Such tables are analysed by choosing an appropriate loglinear model. We focus on the models of symmetry, triangular, diagonal and ordinal quasi symmetry. The fit of a specific model is tested by the chi-squared test or the likelihood-ratio test, where p-values are calculated from the asymptotic chi-square distribution of the test statistic or, if this seems unjustified, from the exact conditional distribution. Since the calculation of exact p-values is often not feasible, we propose alternatives based on algebraic statistics combined with MCMC methods. © 2009 Springer Science+Business Media, LLC.
    DOI: 10.1007/s11222-009-9146-7
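
For the plain symmetry model, the exact conditional distribution is simple enough that the test can be simulated directly, without the Gröbner-basis machinery the paper needs for the richer models: conditionally on the sufficient statistics n_ij + n_ji, each off-diagonal count is Binomial(n_ij + n_ji, 1/2) under symmetry. A sketch:

```python
# Exact conditional Monte Carlo test of the symmetry model for a square
# table, using the Binomial(n_ij + n_ji, 1/2) conditional distribution.
import numpy as np

rng = np.random.default_rng(7)
n = np.array([[20,  8,  3],
              [ 2, 25,  9],
              [ 1,  4, 18]])            # observed square table (made up)

def lr_stat(tab):
    """Likelihood-ratio statistic against the symmetry model."""
    exp = (tab + tab.T) / 2.0           # MLE under symmetry
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(tab > 0, tab * np.log(tab / exp), 0.0)
    return 2.0 * term.sum()

obs = lr_stat(n)
count, n_sim = 0, 10_000
iu = np.triu_indices_from(n, k=1)
for _ in range(n_sim):
    sim = n.copy()
    for i, j in zip(*iu):
        s = n[i, j] + n[j, i]           # fixed by conditioning
        sim[i, j] = rng.binomial(s, 0.5)
        sim[j, i] = s - sim[i, j]
    count += lr_stat(sim) >= obs
print("Monte Carlo p-value:", count / n_sim)
```
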
  • Joint optimization of independent multiple responses
    Erdbrügge, M. and Kuhnt, S. and Rudak, N.
    Quality and Reliability Engineering International 27 (2011)
    Most of the existing methods for the analysis and optimization of multiple responses require some kind of weighting of these responses, for instance in terms of cost or desirability. Particularly at the design stage, such information is hardly available or is rather subjective. An alternative strategy uses loss functions and a penalty matrix that can be decomposed into a standardizing (data-driven) and a weight matrix. The effect of different weight matrices is displayed in joint optimization plots in terms of predicted means and variances of the response variables. In this article, we propose how to choose weight matrices for two and more responses. Furthermore, we prove the Pareto optimality of every point that minimizes the conditional mean of the loss function. © 2011 John Wiley & Sons, Ltd.
    DOI: 10.1002/qre.1229
  • Kernel interpolation
    Mühlenstädt, T. and Kuhnt, S.
    Computational Statistics and Data Analysis 55 (2011)
    Surrogate interpolation models for time-consuming computer experiments are being increasingly used in scientific and engineering problems. A new interpolation method, based on Delaunay triangulations and related to inverse distance weighting, is introduced. This method not only provides an interpolator but also uncertainty bands to judge the local fit, in contrast to methods such as radial basis functions. Compared to the classical Kriging approach, it shows a better performance in specific cases of small data sets and data with non-stationary behavior. © 2011 Elsevier B.V. All rights reserved.
    DOI: 10.1016/j.csda.2011.05.001
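
Plain inverse distance weighting, the classical method the new interpolator is related to, fits in a few lines; the Delaunay triangulation and the uncertainty bands that distinguish the paper's method are not reproduced here.

```python
# Inverse distance weighting (IDW) interpolation of computer-experiment
# output at a new input point.
import numpy as np

def idw(x_new, X, y, p=2.0):
    """Interpolate y at x_new by inverse-distance-weighted averaging."""
    d = np.linalg.norm(X - x_new, axis=1)
    if np.any(d == 0):                      # exact hit: interpolate exactly
        return y[np.argmin(d)]
    w = d ** (-p)
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(8)
X = rng.uniform(size=(30, 2))               # design points
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2      # toy computer-experiment output
print(idw(np.array([0.5, 0.5]), X, y))
```
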
  • Strategies for springback compensation regarding process robustness
    Gösling, M. and Kracker, H. and Brosius, A. and Kuhnt, S. and Tekkaya, A.E.
    Production Engineering 5 (2011)
    In this article, strategies which compensate geometrical deviations caused by springback are discussed using finite element simulations and statistical modelling techniques. First, the ability to predict springback using a finite element simulation model is analysed. For that purpose, numerical predictions and experiments are compared with each other regarding the amount of springback. In the next step, different strategies for compensating springback, such as a modification of the stress condition, the component stiffness and the tool geometry, are introduced. On the basis of finite element simulations, these different compensation strategies are illustrated for a stretch bending process and checked experimentally for an example. Finally, springback simulations are compared regarding their robustness against noise variables such as friction and material properties. To this end, a method based on statistical prediction models is introduced, which approximates the springback distribution accurately with less numerical effort than a classical Monte Carlo method. © 2010 German Academic Society for Production Engineering (WGP).
    DOI: 10.1007/s11740-010-0251-4
  • Breakdown concepts for contingency tables
    Kuhnt, S.
    Metrika 71 (2010)
    Loglinear Poisson models are commonly used to analyse contingency tables. So far, robustness of parameter estimators as well as outlier detection have rarely been treated in this context. We start with finite-sample breakdown points. We show that the breakdown point of mean value estimators determines a lower bound for a masking breakdown point of a class of one-step outlier identification rules. Within a more refined breakdown approach, which takes account of the structure of the contingency table, a stochastic breakdown function is defined. It returns the probability that a given proportion of outliers randomly falls on a pattern where breakdown is possible. Finally, the introduced breakdown concepts are applied to characterise the maximum likelihood estimator and a median-polish estimator. © 2009 Springer-Verlag.
    DOI: 10.1007/s00184-008-0230-3
  • Design and analysis of computer experiments
    Kuhnt, S. and Steinberg, D.M.
    AStA Advances in Statistical Analysis 94 (2010)
    The design and analysis of computer experiments as a relatively young research field is not only of high importance for many industrial areas but also presents new challenges and open questions for statisticians. This editorial introduces a special issue devoted to the topic. The included papers present an interesting mixture of recent developments in the field as they cover fundamental research on the design of experiments, models and analysis methods as well as more applied research connected to real-life applications. © 2010 Springer-Verlag.
    DOI: 10.1007/s10182-010-0143-0
  • Exact confidence intervals for odds ratios with algebraic statistics
    Krampe, A. and Kuhnt, S.
    Studies in Classification, Data Analysis, and Knowledge Organization (2010)
    Odds ratios, which compare the odds of an event occurring in the presence of a potential risk-factor to the odds of it occurring in the absence of the potential risk-factor, are commonly used in medical and social science research. Confidence intervals usually rely on approximate results or exact enumeration. We suggest an algebraic solution to this problem which is of particular use in situations where the approximation is not adequate and exact enumerations are computationally too costly. This algebraic approach relies on the Diaconis-Sturmfels algorithm which combines computational commutative algebra and Markov chain Monte Carlo methods to simulate samples of the conditional distribution of a discrete exponential family with given sufficient statistic. In particular a Groebner basis is used for the construction of the Markov chain. In a simulation study we determine and compare the simulated, exact and approximate results. © 2010 Springer-Verlag Berlin Heidelberg.
    DOI: 10.1007/978-3-642-10745-0_43
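
For reference, the sample odds ratio and the usual Woolf-type approximate confidence interval, i.e., the approximation whose inadequacy motivates the algebraic approach (standard notation):

```latex
% Sample odds ratio for a 2x2 table with cell counts n_{11}, ..., n_{22},
% and the Woolf-type approximate confidence interval on the log scale:
\[
  \widehat{\mathrm{OR}} \;=\; \frac{n_{11}\, n_{22}}{n_{12}\, n_{21}},
  \qquad
  \log \widehat{\mathrm{OR}} \;\pm\; z_{1-\alpha/2}
  \sqrt{\tfrac{1}{n_{11}} + \tfrac{1}{n_{12}}
        + \tfrac{1}{n_{21}} + \tfrac{1}{n_{22}}}\,.
\]
```
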
  • Information acquisition for modelling and simulation of logistics networks
    Kuhnt, S. and Wenzel, S.
    Journal of Simulation 4 (2010)
    Design, organisation and management of Large Logistics Networks (LLN) usually involve model-based analyses of the networks. The usefulness of such an analysis depends strongly on the quality of the input data, which should capture the real circumstances as closely as possible. In this paper, an advanced procedure model for a structured, goal- and task-oriented information and data acquisition for the model-based analyses of LLN is proposed. This procedure model differs from other approaches by focussing on information acquisition rather than solely on data acquisition, and by employing a consistent verification and validation concept. All steps of the procedure model (Goal Setting, Information Identification, Preparation of Information and Data Collection, Information and Data Collection, Data Recording, Data Structuring, Statistical Data Analysis, Data Usability Test) are described and exemplified for an air freight flow network. © 2010 Operational Research Society Ltd. All rights reserved.
    DOI: 10.1057/jos.2009.9
  • Uncertainty in Gaussian Process Interpolation
    Kracker, H. and Bornkamp, B. and Kuhnt, S. and Gather, U. and Ickstadt, K.
    Recent Developments in Applied Probability and Statistics (2010)
    In this article, we review a probabilistic method for multivariate interpolation based on Gaussian processes. This method is currently a standard approach for approximating complex computer models in statistics, and one of its advantages is the fact that it accompanies the predicted values with uncertainty statements. We focus on investigating the reliability of the method's uncertainty statements in a simulation study. For this purpose we evaluate the effect of different objective priors and different computational approaches. We illustrate the interpolation method and the practical importance of uncertainty quantification in interpolation in a sequential design application in sheet metal forming. Here design points are added sequentially based on uncertainty statements.
    DOI: 10.1007/978-3-7908-2598-5_4
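
The workflow reviewed in the article can be illustrated with any GP library: fit the model, read off the predictive standard deviation, and add the next design point where uncertainty is largest. The sketch below uses scikit-learn and a one-dimensional stand-in simulator; it is one simple sequential rule, not the specific designs of the application.

```python
# Gaussian process interpolation with predictive uncertainty, used to add
# design points sequentially at the most uncertain location.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(4 * x) + 0.5 * x          # stand-in simulator
X = np.array([[0.1], [0.4], [0.9]])            # small initial design
y = f(X).ravel()

grid = np.linspace(0, 1, 201)[:, None]
for _ in range(5):                              # sequential design loop
    gp = GaussianProcessRegressor(RBF(0.2), alpha=1e-8).fit(X, y)
    mean, std = gp.predict(grid, return_std=True)
    x_new = grid[np.argmax(std)]                # most uncertain location
    X = np.vstack([X, x_new])
    y = np.append(y, f(x_new)[0])
print(X.ravel())
```
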
