

SLIDE 1

Faster Region-based Hotspot Detection

Ran Chen1, Wei Zhong2, Haoyu Yang1, Hao Geng1, Xuan Zeng3, Bei Yu1

1The Chinese University of Hong Kong 2Dalian University of Technology 3Fudan University

1 / 16

SLIDE 2

Lithography Hotspot Detection

[Figure: pre-OPC layout → post-OPC mask → hotspot on wafer]

◮ Resolution enhancement techniques (RET): OPC, SRAF, MPL.
◮ Hotspots still occur: low-fidelity patterns.
◮ Lithography simulations are extremely CPU-intensive.

[Figure: ratio of lithography simulation time (normalized to the 40nm node) vs. technology node, showing the required computational time reduction.]

2 / 16

SLIDE 3

Previous Solution

Region

[Figure: a conventional hotspot detector scans clips over the region and classifies each as hotspot or non-hotspot.]

◮ A binary classification problem.
◮ Scans over the whole region.
◮ A single-stage detector.
◮ Scanning is time-consuming, and a single stage is not robust to false alarms.

3 / 16

SLIDE 4

Region based approach

[Figure: region-based hotspot detector pipeline: feature extraction → clip proposal network → refinement, operating on a region containing hotspot cores.]

◮ Learns what and where the hotspots are at the same time.
◮ Classification problem → classification & regression problem.

4 / 16

SLIDE 5

Feature Extraction

[Figure: Inception-based extractor. An encoder-decoder (convolution, pooling, deconvolution) preprocesses the layout; a stack of Inception modules (A A B A A A A) follows. Inception A and B each run parallel 1×1 and 3×3 convolution branches whose outputs are concatenated into the feature map.]

Encoder-decoder preprocessing

◮ Symmetric structure for feature encoding and decoding.
◮ Much faster than the discrete cosine transformation.

Inception-based structure

◮ Multi-threaded feature extraction.
◮ Prunes the depth of the output channels at each stage.
◮ Downsamples the feature map in the height and width directions.

5 / 16

SLIDE 6

Clip Proposal Network

Definition

◮ Clip: predefined box used to crop hotspot features in a region.
◮ Proposal: selected clip which contributes to classification and regression.

◮ Based on the extracted features, the Clip Proposal Network is designed to locate and classify hotspots.
◮ The classification and regression branches share features.
[Figure: from a W × H × C input feature map, the classification branch outputs a hotspot score for each clip (Clip 1 … Clip 12), and the regression branch outputs (x, y, w, h) offsets for each clip.]

6 / 16

SLIDE 7

Details on Clip Proposal Network

◮ As a classifier, we have to balance the positive and negative samples.
◮ As a regression task on locations, we need to select reasonable clips as proposals.
◮ We also need to consider the efficiency and quality of the features.

Solutions

◮ Clip pruning.
◮ Hotspot non-maximum suppression.

7 / 16

SLIDE 8

Details on Clip Proposal Network

Intersection over Union (IoU):

    IoU = area(clip_groundtruth ∩ clip_generated) / area(clip_groundtruth ∪ clip_generated).

Clip generation: densely generate groups of clips with different aspect ratios and scales.

◮ Number of clips: w × h × clips per location
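This dense generation can be sketched as follows; the stride, scales, and aspect ratios are illustrative assumptions (chosen so each location gets 12 clips, matching the 12 clips per location shown in the slides), not the paper's actual settings.

```python
# Sketch of dense clip generation over a w x h feature map.
# stride/scales/ratios are illustrative assumptions, not the paper's values.

def generate_clips(w, h, stride=16,
                   scales=(32, 64, 128, 256), ratios=(0.5, 1.0, 2.0)):
    """Return one (cx, cy, cw, ch) clip per (location, scale, ratio)."""
    clips = []
    for y in range(h):
        for x in range(w):
            # Center of this feature-map cell in layout coordinates.
            cx, cy = (x + 0.5) * stride, (y + 0.5) * stride
            for s in scales:
                for r in ratios:
                    # Width/height scaled so the clip keeps area ~ s^2.
                    cw, ch = s * r ** 0.5, s / r ** 0.5
                    clips.append((cx, cy, cw, ch))
    return clips

# Total count is w x h x clips-per-location (here 4 scales x 3 ratios = 12).
clips = generate_clips(4, 4)
```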

Clip Pruning before Classification and Regression.

◮ IoU > 0.7: reserved as a positive sample.
◮ The clip with the highest IoU with any ground truth is also reserved as a positive sample.
◮ IoU < 0.3: reserved as a negative sample.
◮ The remaining clips do not contribute to the network training.
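The IoU computation and thresholding above can be sketched in Python; coordinates are (x1, y1, x2, y2) corner boxes, and the "highest IoU with any ground truth" fallback is noted in a comment but omitted for brevity.

```python
# IoU between two axis-aligned clips, plus the 0.7/0.3 pruning rule.

def iou(a, b):
    """a, b: (x1, y1, x2, y2) corner boxes; returns intersection over union."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda c: (c[2] - c[0]) * (c[3] - c[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def label_clip(clip, ground_truths, hi=0.7, lo=0.3):
    """Return 'positive', 'negative', or 'ignored' for one generated clip.

    Note: the slides also keep, per ground truth, the single clip with the
    highest IoU as positive; that needs all clips at once and is omitted here.
    """
    best = max(iou(clip, gt) for gt in ground_truths)
    if best > hi:
        return 'positive'
    if best < lo:
        return 'negative'
    return 'ignored'   # does not contribute to training
```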

8 / 16

SLIDE 9

Details on Clip Proposal Network

◮ Hotspot non-maximum suppression (NMS).
◮ CS: classification score.
◮ Takes advantage of the structural relation between the core region and clips.
◮ Avoids erroneously dropping true hotspot clips during training.

[Figure: overlapping clips with classification scores CS = 0.9, 0.8, 0.5 under schemes (a) and (b).]

Examples of (a) conventional non-maximum suppression, and (b) the proposed hotspot non-maximum suppression.
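For reference, the conventional greedy NMS of panel (a) can be sketched as below. The proposed hotspot NMS of panel (b) additionally uses the structural relation between cores and clips, whose exact rule is not spelled out on this slide, so only the conventional baseline is shown.

```python
# Conventional greedy score-ordered NMS (panel (a) above).
# Hotspot NMS (panel (b)) would relax the suppression using core/clip
# structure; that variant is not reproduced here.

def iou(a, b):
    """a, b: (x1, y1, x2, y2) corner boxes; returns intersection over union."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda c: (c[2] - c[0]) * (c[3] - c[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def nms(clips, scores, iou_thresh=0.5):
    """Keep the highest-scoring clip, drop overlapping lower-scoring ones."""
    order = sorted(range(len(clips)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(clips[i], clips[j]) <= iou_thresh]
    return keep

# Two heavily overlapping clips plus a distant one: the CS=0.8 clip is dropped.
kept = nms([(0, 0, 10, 10), (1, 1, 11, 11), (20, 20, 30, 30)],
           [0.9, 0.8, 0.5])
```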

9 / 16

SLIDE 10

Loss Function Design

◮ Regression loss for target i (smooth L1):

    l_loc(l_i, l'_i) = (1/2)(l_i − l'_i)²,   if |l_i − l'_i| < 1,
                       |l_i − l'_i| − 0.5,   otherwise.            (1)

◮ Classification loss for target i (cross-entropy):

    l_hotspot(h_i, h'_i) = −(h_i log h'_i + h'_i log h_i).         (2)
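Both losses translate directly into code. The sketch below follows Eqs. (1) and (2); the small epsilon inside the logarithms is a numerical-safety detail assumed here, not stated on the slide.

```python
import math

def smooth_l1(l, l_prime):
    """Regression loss, Eq. (1): quadratic near zero, linear for large errors."""
    d = abs(l - l_prime)
    return 0.5 * d * d if d < 1 else d - 0.5

def hotspot_ce(h, h_prime, eps=1e-12):
    """Classification loss, Eq. (2); eps avoids log(0) (assumed detail)."""
    return -(h * math.log(h_prime + eps) + h_prime * math.log(h + eps))
```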

10 / 16

SLIDE 11

Refinement

[Figure: clips marked as classified non-hotspot, classified hotspot, or unclassified in panels (a) and (b).]

(a) 1st hotspot classification in the clip proposal network; (b) the labelled hotspots are fed into a 2nd hotspot classification in the refinement stage to reduce false alarms.

◮ We get a rough prediction from the clip proposal network.
◮ A refinement stage is applied to further decrease the false alarms.

11 / 16

SLIDE 12

Refinement

[Figure: refinement stage. Inception features (A, B modules) pass through RoI pooling and fully connected (FC) layers into a 2nd classification & regression (C&R) head.]

◮ RoI (Region of Interest) pooling is a resize operation that transforms feature maps to a fixed size.
◮ Only clips selected in the first stage contribute to refinement.
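A minimal single-channel RoI max-pooling sketch follows; the bin-splitting scheme is an illustrative choice, and real implementations pool every channel and first map RoI coordinates onto the feature map.

```python
# Minimal RoI max-pooling sketch: resize an arbitrary H x W feature crop
# to a fixed out_h x out_w grid by taking the max inside each bin.
# Single-channel, nested-list version for illustration only.

def roi_pool(feat, out_h=2, out_w=2):
    """feat: 2-D list (one channel); returns an out_h x out_w pooled grid."""
    h, w = len(feat), len(feat[0])
    pooled = []
    for i in range(out_h):
        # Each bin covers at least one row/column even if h < out_h.
        y0, y1 = i * h // out_h, max((i + 1) * h // out_h, i * h // out_h + 1)
        row = []
        for j in range(out_w):
            x0, x1 = j * w // out_w, max((j + 1) * w // out_w, j * w // out_w + 1)
            row.append(max(feat[y][x]
                           for y in range(y0, y1) for x in range(x0, x1)))
        pooled.append(row)
    return pooled

# A 4x4 crop pooled to 2x2: each output cell is the max of a 2x2 bin.
crop = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
```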

12 / 16

SLIDE 13

Experimental Result

◮ Benchmarks from ICCAD Contest 2016.
◮ Ground-truth hotspot locations are labelled according to the results of industrial 7nm metal layer EUV lithography simulation under a given process window.
◮ No defects were found with lithography simulation on Case1.

Bench     TCAD'18∗                Faster R-CNN†           SSD‡                     Ours
          Accu(%)  FA    Time(s)  Accu(%)  FA    Time(s)  Accu(%)  FA     Time(s)  Accu(%)  FA    Time(s)
Case2      77.78    48    60.0      1.8      3    1.0      71.9     519    1.0      93.02    17    2.0
Case3      91.20   263   265.0     57.1     74   11.0      57.4    1730    3.0      94.5     34   10.0
Case4     100.00   511   428.0      6.9     69    8.0      77.8     275    2.0     100.00   201    6.0
Average    89.66   274.0 251.0     21.9     48.7  6.67     69.0     841.3  2.0      95.8     84    6.0
Ratio       1.00    1.00   1.00     0.24     0.18 0.03      0.87     3.07  0.01      1.07     0.31 0.02

∗ Haoyu Yang et al. (2018). "Layout hotspot detection with feature tensor generation and deep biased learning". In: IEEE TCAD.

† Shaoqing Ren et al. (2015). "Faster R-CNN: Towards real-time object detection with region proposal networks". In: Proc. NIPS, pp. 91–99.

‡ Wei Liu et al. (2016). "SSD: Single shot multibox detector". In: Proc. ECCV, pp. 21–37.

13 / 16

SLIDE 14

Experimental Result

[Figure panels: (a) ground truth, (b) TCAD'18, (c) ours; markers denote false alarms, detected hotspots, and missed hotspots.]

Visualization of different hotspot detection results.

14 / 16

SLIDE 15

Ablation Study

[Figure: bar charts of (a) average accuracy (%) and (b) average false alarms for the settings w/o. ED, w/o. L2, w/o. Refine, and Full.]

Comparison among different settings on (a) average accuracy and (b) average false alarm.

15 / 16

SLIDE 16

Thank You

16 / 16