SLIDE 1

10th FIB SEM User Group Meeting, NIST Gaithersburg MD March 2, 2017

QUANTIFIABLE COMPARATIVE EVALUATION OF FIB/SEM INSTRUMENTS

Valery Ray (1,**), Joshua Taillon (2,*), Lourdes Salamanca-Riba (2,***)

(1) PBS&T, Methuen, MA; (2) University of Maryland, College Park, MD
(*) Present address: Material Measurement Laboratory, NIST, Gaithersburg, MD
(**) vray@partbeamsystech.com; (***) riba@umd.edu

SLIDE 2

Outline

  • Challenges of FIB/SEM equipment evaluation
  • Quantifiable comparative testing approach
  • Design of tests targeting intended applications
  • Practical Examples
  • Summary


SLIDE 3

Challenges of Evaluating FIB/SEM Instruments

  • Complexity of translating application needs into instrumentation requirements and evaluation criteria

  • There are no “bad” instruments out there
  • OEM engineers are highly skilled with demonstrations
    • The outcome of the same operation for an average user could be very different
  • “Canned demo” approach by OEMs
    • Designed to demonstrate the strong sides
  • The art of crafting specifications: “specsmanship”
    • Critical (for your application) performance parameters could be “confidential”
    • Sometimes because they were never known, defined, or even tested


SLIDE 4

Quantifiable Comparative Testing Approach

  • Identify the range of applications for intended usage
  • Translate application goals into instrumentation requirements
  • Design comparative tests and define evaluation criteria (a scoring sketch follows this list)
  • Provide test descriptions and samples to all vendors as early as possible
  • Comprehensive evaluation based on intended use:
    • Quantifiable testing of critical performance parameters, based on pre-defined evaluation criteria
    • Applications demo: overall performance in 3D applications, TEM lamella prep, etc.
  • A two-day evaluation is reasonable to get all the data

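As an illustration of how pre-defined evaluation criteria can be made quantifiable, here is a minimal Python sketch of a weighted scoring rubric. The criterion names, weights, and scores are hypothetical placeholders, not values from the talk.

    # Minimal sketch of a weighted scoring rubric for comparative evaluation.
    # Criterion names, weights, and scores are hypothetical examples.

    CRITERIA = {                 # weight reflects importance for the intended use
        "beam_quality": 0.25,
        "system_stability": 0.25,
        "etch_placement": 0.20,
        "lamella_throughput": 0.15,
        "3d_ease_of_use": 0.15,
    }

    def weighted_score(scores: dict) -> float:
        """Combine per-criterion scores (0-10) into one weighted number."""
        return sum(CRITERIA[name] * scores[name] for name in CRITERIA)

    # Example: two hypothetical vendors scored against the same rubric
    vendor_a = {"beam_quality": 8, "system_stability": 6, "etch_placement": 9,
                "lamella_throughput": 7, "3d_ease_of_use": 5}
    vendor_b = {"beam_quality": 7, "system_stability": 9, "etch_placement": 6,
                "lamella_throughput": 8, "3d_ease_of_use": 8}

    print(f"Vendor A: {weighted_score(vendor_a):.2f}")
    print(f"Vendor B: {weighted_score(vendor_b):.2f}")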

SLIDE 5

Tests Targeting Intended Applications

  • General Performance
    • Beam quality; System stability; Aperture repeatability
  • Patterning
    • Beam placement; Etching fidelity; Beam drifts and shifts
  • TEM lamella preparation
    • Throughput; Thickness uniformity; Ease of use; Automation; Endpoint
  • FIB tomography (3D slice-and-view)
    • Unattended runtime; Image quality; Throughput; Ease of use; Drift correction; Focus tracking; Slice thickness uniformity; EDS integration
  • Imaging
    • SEM SE, SEM BSE, STEM BF, STEM DF, FIB SE, FIB SI, …


SLIDE 6

Samples for comparative evaluation

  • Performance Testing: SiO2 optical flat, ~24 nm evaporated Al coating, silver paint around the perimeter
  • Application Testing: epoxy-impregnated Solid Oxide Fuel Cell (SOFC), a 2-phase ceramic
  • Same sample(s) to all vendors; require return of test sample(s) for independent analysis

[Images: SiO2 etching profiles, aperture repeatability, TEM lift-out, deposition profiles]

SLIDE 7

General Performance – Beam Quality

  • Basic test of ion beam quality: shape and homogeneity (a quantitative shape metric is sketched below)

[Figure: beam-burn series vs. increasing aperture size and dose, showing an inhomogeneous beam, a systematic high-current problem, and a single-aperture problem]
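A beam-burn image can be reduced to numbers rather than judged by eye. Below is a minimal sketch, assuming the burn has been imaged as a 2-D intensity array: it computes intensity-weighted second moments and reports an ellipticity, where a round, homogeneous beam gives a value near 1. The synthetic Gaussian spot stands in for real SEM data.

    import numpy as np

    def spot_shape_metrics(img: np.ndarray):
        """Centroid-referenced second moments of a beam-burn image.

        Returns major/minor sigma (pixels) and ellipticity; a round,
        homogeneous beam gives an ellipticity close to 1.
        """
        img = img.astype(float)
        img -= img.min()                      # crude background removal
        total = img.sum()
        y, x = np.indices(img.shape)
        cx, cy = (x * img).sum() / total, (y * img).sum() / total
        # Central second moments of the intensity distribution
        mxx = ((x - cx) ** 2 * img).sum() / total
        myy = ((y - cy) ** 2 * img).sum() / total
        mxy = ((x - cx) * (y - cy) * img).sum() / total
        # Eigenvalues of the moment matrix give the principal axes
        evals = np.linalg.eigvalsh(np.array([[mxx, mxy], [mxy, myy]]))
        minor, major = np.sqrt(np.maximum(evals, 0))
        return major, minor, major / minor    # ellipticity >= 1

    # Synthetic example: an elongated Gaussian "burn" standing in for data
    yy, xx = np.mgrid[0:64, 0:64]
    spot = np.exp(-(((xx - 32) / 6.0) ** 2 + ((yy - 32) / 4.0) ** 2))
    print(spot_shape_metrics(spot))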

SLIDE 8

General Performance – System Stability

  • Basic test of tool stability: an L-shaped single-line pattern with 21 beam burns on each arm, a 1 µm offset along the line, and a 5-minute delay between burns (drift-rate reduction sketched below)

[Images, 20 µm and 1 µm scale bars: HVAC failure; typical system drift; “quite not bad”; close-up of the 1st pair of burns]
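The stability test reduces to comparing measured burn positions against the programmed 1 µm grid. A minimal sketch, assuming burn centers have been measured from the image; the positions below are synthetic, not data from the talk.

    import numpy as np

    PITCH_UM = 1.0        # programmed offset between burns
    DELAY_MIN = 5.0       # wait between burns

    def drift_rate(measured_um: np.ndarray) -> np.ndarray:
        """Per-interval drift rate (nm/min) along one arm of the L pattern."""
        programmed = PITCH_UM * np.arange(len(measured_um))
        error_nm = (measured_um - measured_um[0] - programmed) * 1e3
        return np.diff(error_nm) / DELAY_MIN

    # Hypothetical positions: steady 4 nm/min drift plus measurement noise
    rng = np.random.default_rng(0)
    positions = (np.arange(21) * PITCH_UM          # programmed grid
                 + np.arange(21) * DELAY_MIN * 0.004   # 4 nm/min drift, in um
                 + rng.normal(0, 0.002, 21))           # 2 nm metrology noise
    rates = drift_rate(positions)
    print(f"mean drift: {rates.mean():.1f} nm/min, "
          f"worst interval: {rates.max():.1f} nm/min")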

SLIDE 9

Critical Performance – Etch Placement & Fidelity

[Figure: etch tests at 1 pA, 3 pA, and 10 pA (1 µm scale bar), with and without gas, at doses of 0.2, 0.6, and 1.8 nC/µm²]

Shortest dwell time, −20% pixel overlap, ×2 pattern repeats: (a) sputtering / gas-assisted etching (XeF2) and (b) sputtering / deposition (C, Pt, W)

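The patterning times behind this dose series follow from the area dose D = I·t/A, so t = D·A/I. A quick arithmetic check using the doses and currents from the slide; the 1 µm² test area is an assumption for illustration.

    # Patterning time from area dose: t = D * A / I.
    # Doses and currents are from the slide; the area is a hypothetical example.

    AREA_UM2 = 1.0                                 # assumed pattern area
    for dose_nc_um2 in (0.2, 0.6, 1.8):            # doses from the slide
        for current_pa in (1, 3, 10):              # currents from the slide
            t_s = dose_nc_um2 * 1e-9 * AREA_UM2 / (current_pa * 1e-12)
            print(f"{dose_nc_um2} nC/um^2 at {current_pa:2d} pA: {t_s:7.1f} s")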

SLIDE 10

Critical Performance – Etching Placement

  • Patterning where intended, with and without gas injection
  • A C or Pt stripe is e-beam deposited across the lines, then a TEM lamella is prepared and STEM-imaged as part of application testing (a placement-error reduction is sketched below)

[Figure: expected performance vs. visible artifacts]

  • Aperture-change shift: a problem for multiple-current patterning
  • Shift due to gas: a problem for site-specific deposition and GAE
  • Drift after aperture change: a problem for automatic patterning
  • Drift during line exposure
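Placement can be quantified from the STEM cross-section by comparing measured line centers against the programmed layout. A minimal sketch; the positions below are hypothetical, not measurements from the talk.

    import numpy as np

    programmed_um = np.array([0.0, 1.0, 2.0, 3.0, 4.0])       # intended centers
    measured_um   = np.array([0.00, 1.02, 2.05, 2.98, 4.11])  # from STEM image

    # Remove any common offset (stage registration), keep relative placement
    error_nm = (measured_um - programmed_um) * 1e3
    error_nm -= error_nm.mean()
    print(f"placement error: std {error_nm.std():.0f} nm, "
          f"worst {np.abs(error_nm).max():.0f} nm")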

SLIDE 11

Critical Performance – Etching Fidelity

  • Sidewall slope and aspect ratio of the No-Gas profile define polishing efficiency
  • Al layer removal indicates beam-tail damage to the surface
  • The narrowest cut is defined by the width of the tip of the No-Gas etching profile
  • The No-Gas to GAE profile area ratio defines the GAE enhancement (sketched below)

[Figure: No-Gas and GAE cross-section profiles; “Al intact” vs. “Al removed” regions marking beam-tail damage]
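The GAE enhancement figure of merit is simply the ratio of the two profile cross-section areas at equal dose. A trivial sketch; the areas below are hypothetical values standing in for measurements from the images.

    # GAE enhancement = GAE profile area / No-Gas profile area at equal dose.
    # Areas are hypothetical examples, not measured values from the talk.

    area_no_gas_um2 = 0.15   # No-Gas etch profile cross-section area
    area_gae_um2 = 1.20      # GAE (XeF2) profile area at the same dose

    enhancement = area_gae_um2 / area_no_gas_um2
    print(f"GAE enhancement factor: {enhancement:.1f}x")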

SLIDE 12

Application Testing – SOFC imaging

  • Side-by-side comparison of imaging the same sample (a contrast-to-noise metric is sketched below)

[Images: SE and BSE micrographs of the same SOFC region]
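One way to make the SE/BSE comparison quantifiable is a contrast-to-noise ratio between the two ceramic phases. A minimal sketch, assuming phase masks are available; the images and masks below are synthetic stand-ins for the real data.

    import numpy as np

    def cnr(img: np.ndarray, mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Contrast-to-noise ratio between two labeled phases of the SOFC.

        mask_a / mask_b select pixels of the two ceramic phases; a higher
        CNR means the detector separates the phases more cleanly.
        """
        a, b = img[mask_a], img[mask_b]
        return abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

    # Hypothetical images and phase masks stand in for the real data
    rng = np.random.default_rng(1)
    se_img = rng.normal(100, 20, (256, 256))    # SE: little phase contrast
    bse_img = rng.normal(100, 20, (256, 256))
    phase_a = np.zeros((256, 256), dtype=bool)
    phase_a[:, :128] = True
    phase_b = ~phase_a
    bse_img[phase_b] += 40                      # BSE: stronger Z contrast
    print(f"SE CNR:  {cnr(se_img, phase_a, phase_b):.2f}")
    print(f"BSE CNR: {cnr(bse_img, phase_a, phase_b):.2f}")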
SLIDE 13

Application Testing – 3D Reconstruction

  • Fix experimental parameters between vendors: slice thickness, image resolution, dwell time, detector settings
  • Run overnight, if possible
  • Results to evaluate (a volume-rate calculation is sketched below):
    • Total running time (limited by stability)
    • Usable acquisition volume per hour
    • Acquired image quality
    • Output and ease of use of the 3D visualization software

[Image: example of vendor visualization output]
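The usable acquisition volume per hour is simple arithmetic over the run parameters. A back-of-envelope sketch; all the run values below are hypothetical examples.

    # Usable acquisition volume per hour for a slice-and-view run.
    # All run parameters are hypothetical examples.

    slice_nm = 30.0          # fixed slice thickness
    fov_um = (20.0, 15.0)    # imaged field of view (x, y)
    n_slices = 400           # slices completed before drift ended the run
    runtime_h = 14.0         # total unattended running time

    volume_um3 = fov_um[0] * fov_um[1] * n_slices * slice_nm * 1e-3
    print(f"total volume: {volume_um3:.0f} um^3, "
          f"rate: {volume_um3 / runtime_h:.0f} um^3/h")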

SLIDE 14

Summary

  • The quantifiable testing approach enables comparative evaluation of FIB/SEM instruments by collecting performance data under controlled conditions
    • It requires careful sample preparation, thorough test design, and demo planning
  • Seamless integration of performance tests with the applications demo facilitates a comprehensive evaluation:
    • providing OEMs an opportunity to showcase the strong features of their equipment
    • while allowing side-by-side comparison of critical performance parameters
  • There are no “bad” tools, but nobody is perfect either
    • Interpret test results in the context of realistic application requirements