Technical Insight

Eradicating errors in GaAs fabs

To increase yield and throughput in high-volume GaAs fabs, engineers should obtain more data from suppliers, introduce better approaches to analysing process data, and understand how different statistical methods handle outliers.
BY RICHARD STEVENSON

Even for an established GaAs fab, it is far from easy to master high-volume manufacture of a high-class product. One of the keys to success is to develop a competitive prototype that appeals to customers and leads to substantial orders. But that isn't all that is needed. Generating a healthy profit margin month-in, month-out requires engineers to guard against anything that could lead to a drop in yield.

Several approaches that can be taken to ensure a high yield in a GaAs fab were discussed at this year's CS Man Tech Conference. At this meeting, held in Miami from 16-19 May, speakers described: the benefits resulting from greater information about incoming materials; a route to more insightful, speedier trouble-shooting via a new approach to handling process tool data; and the most appropriate methods for treating outliers within data sets.

Working with suppliers

When Marie Le Guilly took over as supplier quality engineer for epiwafers and metals at Qorvo's Hillsboro fab, she was in no doubt that there had to be greater control of incoming materials: "I needed the supplier data in order to troubleshoot issues quickly and continuously learn," says Le Guilly, who believes that speed is vital to Qorvo's business. "We continuously make product, so if there is a material issue, we need to contain it and resolve it very quickly."

Before this approach was adopted, there were many instances when product failures could be traced back to material that was compromised but still met a purchasing specification. To close this loophole, Le Guilly convinced suppliers to share data that they had previously kept internal. While a small shift in one characteristic of a material may not concern suppliers, it can have a profound impact on device performance.

Although some suppliers have been more reluctant than others to disclose information, they are now sharing their data – and doing so without increasing their prices.

Figure 1. Qorvo's supplier of titanium caught the uptick in oxygen content in the titanium target, thanks to the ship-to-control specification. The hike was traced back to the titanium powder supply. USL is the upper shipping limit and UCL is the upper control limit.

From a supplier's perspective, disclosing more data has pros and cons. One supplier shipped a metal to Qorvo that was outside 'ship-to-control' limits, but within purchasing specifications. This source is now under evaluation at the Hillsboro fab.

"This is putting a burden on the supplier as they wait for our results," admits Le Guilly, who points out that there are also benefits for them "“ the supplier improves internal control of material and control of suppliers.

In their CS Man Tech paper, Le Guilly and co-workers offered two illustrations that highlight the benefits a chipmaker obtains when receiving more data from a supplier. The first example involved a supplier of metals, which present a high risk to a GaAs fab because a shift in material characteristics may only be detected during electrical testing – and by that stage the fab will have processed hundreds of wafers that subsequently have to be scrapped.

By monitoring its composition, the supplier exposed a hike in oxygen content in its titanium target, and traced the issue back to the raw material supplier. Qorvo evaluated this material – it was above the upper control limit but inside the upper shipping limit – before releasing it to production.
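The logic of such a ship-to-control check is simple enough to sketch. The snippet below is a minimal illustration with invented numbers, not a description of Qorvo's system: it derives a three-sigma control limit from historical assay data and flags a lot that meets the purchasing specification but breaches the tighter control limit, as in the titanium example.

```python
import numpy as np

# Hypothetical historical oxygen-content assays (ppm) for a titanium target
history = np.array([410, 398, 405, 402, 395, 408, 400, 403, 399, 406])

mean, sigma = history.mean(), history.std(ddof=1)
ucl = mean + 3 * sigma   # upper control limit (ship-to-control)
usl = 450                # upper shipping limit from the purchasing spec (invented)

def disposition(oxygen_ppm: float) -> str:
    """Classify an incoming lot against both limits."""
    if oxygen_ppm > usl:
        return "reject: outside purchasing specification"
    if oxygen_ppm > ucl:
        return "hold for evaluation: within spec but above control limit"
    return "release to production"

print(disposition(430))  # in spec, but above the UCL, so held - as in Figure 1
```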

The second example involved Qorvo's partnership with an epiwafer supplier, which shared a great deal of data that had been previously restricted to internal use. By accessing data that goes beyond what is included in purchasing specifications, Qorvo's engineers have been able to connect supplier data to device parametric data in real time.

Several benefits have resulted. When Qorvo made the transition to a new supplier for MBE pHEMT epiwafer production, correlating epi and parametric data slashed the time taken to establish a ship-to-control specification. When devices started to fail due to a low pinch-off voltage, this could be correlated with a low pinch-off voltage measured by the supplier, leading to an increase in the ship-to-control specification. And when RF failures were correlated with on-resistance and traced back to variations in the aluminium content of the Schottky device layer, Qorvo added a photoluminescence measurement to the specifications to control composition.

Figure 2. Process control monitoring of the sheet resistance of an epilayer at WIN Semiconductors revealed an increase in variation in February 2015. Insight provided by an internally developed computer program, applying analysis-of-variance, identified the process behind this – a post-metal surface clean. Engineers traced the problem to a dilution of the cleaning solution in a particular tool.

Historically, one of the advantages of being a vertically integrated chipmaker, rather than one that outsources the epi, has been the close cooperation that can be fostered between the process and device engineers. But that gap can be bridged when supplier data is handed over to the chipmakers, argues Le Guilly. "[It] enables successful outsourcing." 

Digging in the data

When large GaAs fabs are running close to capacity, they are churning out millions of die and creating a colossal amount of data. So, to try and deal with all that data in an efficient and effective manner – and to empower engineers to troubleshoot quickly and successfully – a pair of engineers at WIN Semiconductors have spent three years developing an in-house data analysis program.

At WIN, the majority of semiconductor processing steps are performed on one of several tools that are nominally the same but produce slightly different results. To account for these differences, engineers used a test known as analysis-of-variance, which quantifies the difference between the results produced by a set of tools performing a particular process, as the sketch below illustrates. By applying this test to all process steps, the steps can be ranked using statistical criteria, exposing those with the greatest variations.
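To make this concrete, here is a minimal sketch of a tool-to-tool comparison using one-way analysis-of-variance in SciPy. It illustrates the statistical test itself, not WIN's proprietary program, and the readings are invented:

```python
from scipy import stats

# Hypothetical sheet-resistance readings (ohm/sq), grouped by the tool
# that performed the same nominal process step
tool_a = [50.1, 49.8, 50.3, 50.0, 49.9]
tool_b = [50.2, 50.0, 49.7, 50.1, 50.4]
tool_c = [51.0, 51.3, 50.9, 51.2, 51.1]   # this tool runs noticeably high

# One-way ANOVA asks: is the variation between tools large relative to
# the variation within each tool?
f_stat, p_value = stats.f_oneway(tool_a, tool_b, tool_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2e}")  # a small p-value flags a real difference
```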

One of the pair of engineers involved in developing the computer program, Ming-Wei Tsai, spoke about this work at CS Man Tech. He says that the computer program, which was ready about a year ago, is designed for off-line monitoring, rather than real-time process monitoring. "We are promoting this software tool to our process engineers, and encouraging them to use it for identifying the source of parametric performance variations."

Figure 3. At WIN Semiconductors, engineers noted that the DC yield variation of a customer product increased in the middle of February 2015. The problem was traced to a photolithography exposure tool, EQ5. It had a different chuck configuration to the other tools, and caused a different thin-film resistor critical dimension uniformity at the wafer edge for specific mask sets. 

Sources of the greatest variations are identified using F-statistics. This is a common statistical approach for qualifying a new tool: it involves comparing the results produced by the new tool with those produced by a baseline tool. "However, the idea of ranking the F-statistics for all the interactive process steps originates from us," says Tsai.

Each F-statistic generates a figure-of-merit known as a p-value: the lower it is, the larger the variation. So, by listing p-values for the various processes, as in the sketch below, engineers at WIN can quickly uncover the origin of the most significant process variations and address them. Tsai believes that the money saved by identifying and solving issues more quickly has already outweighed that spent on developing the computer program.
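Ranking by p-value, rather than by the raw F-statistic, puts steps with different numbers of tools and lots on a common scale. A minimal sketch of the idea, with invented step names and readings, might look like this:

```python
from scipy import stats

# Hypothetical per-tool measurements for several process steps
steps = {
    "post_metal_clean": ([10.1, 10.2, 10.0], [10.1, 10.3, 10.2], [11.5, 11.4, 11.6]),
    "via_etch":         ([5.0, 5.1, 4.9],    [5.0, 5.2, 5.1],    [5.1, 5.0, 4.9]),
    "litho_expose":     ([2.0, 2.1, 2.0],    [2.1, 2.0, 2.2],    [2.0, 2.1, 2.1]),
}

# The smallest p-value points at the step whose tools disagree the most
ranked = sorted((stats.f_oneway(*groups).pvalue, name) for name, groups in steps.items())
for p, name in ranked:
    print(f"{name}: p = {p:.2e}")   # post_metal_clean surfaces at the top
```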

One example of its capability, described by Tsai in his CS Man Tech paper, involved identifying the cause of an increase in the variation of the sheet resistance of an epitaxial layer. Empowered by the program, engineers took less than five minutes to analyse 170 wafer lots fabricated over a six-month period, scrutinising more than 400 process steps listed by p-value. This identified the culprit as a post-metal cleaning step. Armed with this insight, engineers produced plots for individual pieces of equipment, exposing 'EQ1' as the troublesome tool. Eventually, they determined the root cause: dilution of the cleaning solution by residual deionised water, which came from wafers subjected to another process on the same cleaning tool.

Tsai's paper offers a second example of the benefits of the in-house program – uncovering the cause of a DC yield variation in a customer product, which deteriorated in the middle of February 2015. WIN's program generated p-values for several processes. The lowest came from a visual inspection that could not be correlated with the yield drop; after this, the most promising candidate was the photolithography step for producing the thin-film resistor.

Plotting charts of individual tools for this step revealed that one of them, EQ5, produced a lower yield than the others. Engineers then identified the origin of this reduction in yield as a difference in chuck configuration, which caused a difference in thin-film resistor critical dimension uniformity at the wafer edge, but only for some mask sets. This behaviour had not been observed when DC yield data were collected during qualification, because a different chuck configuration was involved then.

The software has also helped to identify another issue, one not described in Tsai's CS Man Tech paper: an etching process for forming a via recently produced several out-of-control events.

"Etching photo engineers had checked their process tools, and all of them were normal," explains Tsai. So they then checked all the manufacturing steps with their computer program. This identified the problem as a measurement recipe associated with the scanning electron microscope. 

After restoring the recipe, the via etching process returned to normal. "Our engineers are still trying to find out the real root cause and prevent it from happening again," remarked Tsai.

Figure 4. This histogram shows the results of the same in-house measurement on about 100,000 parts produced by Qorvo.

Although the program built by Tsai and his co-worker is powerful, it does have its limitations. A major one is that problems cannot be picked up immediately: after an abnormal process begins, sufficient data must be collected from the different machine tools before a low p-value emerges. Another weakness is that the selection of the initial lot list can affect the p-value rankings. The consequence is that an engineer might have to investigate several steps with low p-values.


Dealing with outliers

When devices are manufactured in high volumes, it is inevitable that a few parts will have characteristics that differ significantly from the majority. Even though these outliers may still conform to specifications, their difference may indicate a defective part, or one that is less reliable.

That was the case for power amplifiers made by leading GaAs chipmaker TriQuint, which merged with RFMD in 2015 to form Qorvo. TriQuint used 'part-average testing' to screen for outliers, and in December 2013 this led engineers to note an increase in leakage on a high-band input pin of a GSM/EDGE power amplifier. Further investigation revealed that the failures flagged by this part-average testing methodology originated from a highly localised fab defect on the base contact of a diode on the high-band HBT die.
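Part-average testing screens out parts whose readings sit unusually far from the population centre, even when they remain inside the datasheet specification. The sketch below illustrates the basic idea with invented leakage readings; it follows a common AEC-Q001-style recipe of setting dynamic limits at six robust sigmas from the median, and is not a description of Qorvo's production implementation:

```python
import numpy as np

def pat_limits(readings: np.ndarray, k: float = 6.0):
    """Dynamic part-average-testing limits: median +/- k robust sigmas.

    Uses the interquartile range as a robust spread estimate; the exact
    recipe varies from fab to fab.
    """
    q1, median, q3 = np.percentile(readings, [25, 50, 75])
    robust_sigma = (q3 - q1) / 1.35   # IQR of a normal distribution ~ 1.35 sigma
    return median - k * robust_sigma, median + k * robust_sigma

# Hypothetical leakage readings (uA): most parts cluster, one sits apart
leakage = np.array([1.0, 1.1, 0.9, 1.2, 1.0, 1.1, 0.95, 1.05, 3.8])
lo, hi = pat_limits(leakage)
outliers = leakage[(leakage < lo) | (leakage > hi)]
print(f"PAT limits: ({lo:.2f}, {hi:.2f}); screened parts: {outliers}")
```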

Qorvo's Thorsten Seager recalled this investigation in his CS Man Tech paper, which discussed methods for dealing with outliers. Here, he also offered another compelling reason to deal with outliers: if they are ignored, they can make a mockery of data analysis. For example, if they are included when fitting an equation to a set of data, the best-fit line may fail to pass through the majority of points, compromising its predictive power (see the sketch below). Another argument for developing a good approach to outliers is the ever-growing complexity of RF products: ten years ago mobile phones had two or three bands; now they contain 40 or more, and 5G will lead to even more RF components.
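The distorting effect of an outlier on a least-squares fit is easy to demonstrate. In this invented example, a single wild point drags the fitted slope well away from the trend that the other nine points follow:

```python
import numpy as np

# Nine well-behaved points on the line y = 2x + 1, plus one outlier
x = np.arange(10.0)
y = 2 * x + 1
y[9] = 60.0                                  # a single wild point

slope_all, _ = np.polyfit(x, y, 1)           # fit including the outlier
slope_clean, _ = np.polyfit(x[:9], y[:9], 1) # fit with it excluded
print(f"slope with outlier: {slope_all:.2f}")   # pulled well above 2
print(f"slope without it:   {slope_clean:.2f}") # recovers 2.00
```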

"This represents major manufacturing challenges, since the yield of each component geometrically impacts the stacked yield of the module," says Seager, who adds that the outlier component wastes the cost of all the other components in the module. So, if efforts to increase integration are to be successful, they must be accompanied by tighter tolerances and a reduction in variation.

When looking at data, such as that shown in Figure 4, it is easy to spot the outliers: they can be defined as observations not connected to the main distribution. However, care must be taken, as there are instances when many of the points detached from the main distribution are clumped together. When that happens, it is more appropriate to treat them as a second mode.

The data shown in Figure 4, produced by making the same measurement on roughly 100,000 parts, has a main distribution with a positive skew and four minor peaks of outliers. The two extreme peaks are unimodal – that is, all their data falls in a single bin. That could be due to clamping by the test system: it may have been programmed to a fixed range, or given set minimum and maximum values. Meanwhile, the two peaks immediately to the right of the main distribution could be due to a process or assembly issue.

Based on these observations, Seager and co-workers took measurements on five products, ranging from PA modules to filters, and applied five statistical methods to the data. They concluded that modified part-average testing is the best method for dynamic screening at the package part test, while for other scenarios, such as data analysis, the adjusted boxplot offers a reliable method (sketched below).
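The adjusted boxplot extends Tukey's familiar 1.5 × IQR fences to skewed data by incorporating the medcouple, a robust measure of skewness. The sketch below follows the Hubert-Vandervieren formulation on invented, positively skewed data, so treat it as an illustration of the method rather than Qorvo's exact procedure:

```python
import numpy as np
from statsmodels.stats.stattools import medcouple

def adjusted_boxplot_fences(data: np.ndarray):
    """Outlier fences for skewed data (Hubert & Vandervieren, 2008)."""
    q1, q3 = np.percentile(data, [25, 75])
    iqr = q3 - q1
    mc = medcouple(data)   # robust skewness, in [-1, 1]; 0 recovers Tukey's fences
    if mc >= 0:
        lo = q1 - 1.5 * np.exp(-4 * mc) * iqr
        hi = q3 + 1.5 * np.exp(3 * mc) * iqr
    else:
        lo = q1 - 1.5 * np.exp(-3 * mc) * iqr
        hi = q3 + 1.5 * np.exp(4 * mc) * iqr
    return lo, hi

# Hypothetical positively skewed parametric data, like Figure 4's main mode
rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=0.4, size=1000)
lo, hi = adjusted_boxplot_fences(data)
print(f"fences: ({lo:.2f}, {hi:.2f}); outliers: {np.sum((data < lo) | (data > hi))}")
```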

These findings, along with those highlighting the gains of sharing data between suppliers and chipmakers, and the benefits of superior methods for handling processing data, will help engineers at GaAs fabs to have greater mastery over their chipmaking. While improvements to running a fab may not receive as much attention as the development of superior chip designs and record-breaking device developments, they make a massive contribution towards profitable, high-volume manufacture of great GaAs products. 

