Lens Tolerances
(Originally published in the April 2007 Laser Focus World)
Lens Tolerances: Software Eliminates the Guesswork
by Donald C. Dilworth
Some years ago I designed a small high-resolution projection lens. The tolerances were tight, and some elements had to be centered to within 10 microns. The drawings went to an offshore company, and when the lenses came back they did not work. The vendor had designed the cell in two parts that screwed together, but the threads were not concentric with the lenses. You could feel the decenter with your fingers in some cases. I reported the error to the shop, but they replied angrily, insulted that I had criticized their work. Another time, different lenses came back from a different shop and did not work properly. After two months of strenuous denial, the shop finally admitted that the parts were far out of spec.
While we do not expect lens manufacturers to be skilled mathematicians, they need to understand that there is a science behind the tolerances on the drawings. These examples show that some do not. They are not entirely to blame.
Before the development of modern lens design software, most designers assigned tolerances to lens dimensions following a rule-of-thumb procedure involving more guesswork than calculation. A few still do. Some designers calculate tolerances by the simple means of first getting inverse sensitivities, which give the amount by which each variable can be in error, with everything else perfect, and still meet the performance specifications. Then they just divide those numbers by the square root of the total number of variables, and that is the budget. This procedure is better than the rule-of-thumb approach, but still not rigorous by modern standards. That budget is proper only in cases where one would invariably find each parameter in error by exactly the assigned tolerance. Since lens parameters are most often somewhere within their range – but not exactly at the edge – a budget derived with that approach invariably assigns tolerances that are tighter than they need to be. For that reason, when a shop makes an element that exceeds the tolerances created in this way, and the lens works perfectly well, they are inclined to conclude that tolerances in general need not be closely adhered to – and this opinion can unfortunately be applied to other projects where the tolerances are more precise. This is a mistake.
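To make the arithmetic concrete, here is a minimal sketch of that older procedure in Python; the parameter names, units, and inverse sensitivities are invented for illustration.

import math

def simple_budget(inverse_sensitivities):
    # Old-style budget: divide each inverse sensitivity by sqrt(N).
    # Each value is the error that would, by itself, just consume the
    # whole performance allowance (hypothetical numbers).
    n = len(inverse_sensitivities)
    return {name: s / math.sqrt(n) for name, s in inverse_sensitivities.items()}

sens = {"surface 1 radius (fringes)": 4.0,
        "element 2 thickness (mm)": 0.20,
        "element 3 decenter (mm)": 0.015}
print(simple_budget(sens))
# With three parameters, every tolerance is tightened by 1/sqrt(3), about 0.58,
# which is appropriate only if every error really sits at the edge of its bracket.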
Modern Practice
Modern tolerances, by contrast, calculated by a state-of-the-art lens design program such as SYNOPSYS™, are more sophisticated. This software first identifies all of the parameters that are to go into the budget and the quality descriptors that define the imaging goals. Then the rules of statistics tell us that, if we know the mean value and standard deviation of each parameter, we also know the mean value and standard deviation of each of the quality descriptors. The procedure is rigorous under certain conditions, and the output from the program is a table giving all of the tolerances. Thus we know, for example, that if a lens were built to that table, the Strehl ratio everywhere in the field would be higher than, say, 0.7, to a confidence level of 2, which means that we could confidently predict that, of a large batch of lenses, 97.7% would have a larger Strehl ratio. Gone is the guesswork behind a lens tolerance budget.
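The idea can be illustrated with a small Monte Carlo experiment; this is a sketch of the statistics only, with invented numbers, not the actual SYNOPSYS computation. Draw each toleranced parameter from its assumed distribution, evaluate a quality descriptor, and note the value exceeded by roughly 97.7% of the trial builds, which for an approximately normal descriptor is the mean minus two standard deviations.

import math, random, statistics

# Monte Carlo sketch (invented numbers).  Each toleranced parameter is assumed
# to contribute an independent wavefront error; the Strehl ratio is estimated
# with the Marechal approximation from the combined RMS wavefront error.
def strehl(errors, waves_per_unit=0.05):
    rms_wfe = math.sqrt(sum((waves_per_unit * e) ** 2 for e in errors))  # waves
    return math.exp(-(2 * math.pi * rms_wfe) ** 2)

tolerances = [1.0] * 8                                     # hypothetical half-widths
builds = []
for _ in range(20000):
    errors = [random.uniform(-t, t) for t in tolerances]   # uniform within each bracket
    builds.append(strehl(errors))

mu, sigma = statistics.mean(builds), statistics.pstdev(builds)
floor = mu - 2 * sigma                                     # "confidence level of 2"
print(f"predicted Strehl floor: {floor:.2f}")
print(f"fraction of builds above it: {sum(s > floor for s in builds) / len(builds):.3f}")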
A further refinement results from giving the software a definition of what constitutes a "tight" tolerance. Any tolerance that wants to be tighter than that will be loosened somewhat by the program, with most others getting tighter to compensate. This procedure reduces overall cost. Since the definitions of loose and tight are more or less the same throughout the precision lens industry, the designer can usually generate a low-cost budget even without close collaboration with the shop.
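One simple way to picture this rebalancing, with invented numbers, is the sketch below: pin any tolerance tighter than the stated floor at that floor, then scale the remaining tolerances down so that the predicted variance of the quality descriptor is unchanged. This is an illustration of the idea only, not a description of the SYNOPSYS algorithm.

import math

def rebalance(tolerances, sensitivities, floor):
    # Illustration only.  Assumes independent, uniformly distributed errors,
    # so each parameter contributes (sensitivity * tolerance)**2 / 3 to the
    # predicted variance of the quality descriptor.
    var_target = sum((s * t) ** 2 / 3 for s, t in zip(sensitivities, tolerances))
    pinned = [i for i, t in enumerate(tolerances) if t < floor]   # too tight: loosen to the floor
    free = [i for i in range(len(tolerances)) if i not in pinned]
    var_pinned = sum((sensitivities[i] * floor) ** 2 / 3 for i in pinned)
    var_free = sum((sensitivities[i] * tolerances[i]) ** 2 / 3 for i in free)
    scale = math.sqrt(max(var_target - var_pinned, 0.0) / var_free)  # tighten the rest to compensate
    return [floor if i in pinned else tolerances[i] * scale for i in range(len(tolerances))]

# Hypothetical case: the third tolerance is uncomfortably tight and is loosened
# to the floor, while the others tighten by a few percent to hold the predicted yield.
print(rebalance([0.10, 0.08, 0.004, 0.12], [1.0, 1.2, 5.0, 0.8], floor=0.01))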
But we mentioned “certain conditions.” Here is where the vendor can help to meet the error budget – or derail the whole procedure. For the budget to be valid, the following must be true (a quick numerical check follows the list):
- The mean value of each parameter must be at the design value.
- The parameter must be equally likely to be found anywhere within its error bracket.
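Here is that quick numerical check for a single parameter with tolerance ±T about its design value; the thickness and tolerance below are invented. Note how a shop that deliberately aims for the top of the bracket violates the first condition.

import random, statistics

T, design = 0.05, 3.00                         # hypothetical thickness and tolerance, mm

assumed = [design + random.uniform(-T, T) for _ in range(100000)]
print(statistics.mean(assumed))                # ~3.000: mean at the design value
print(statistics.pstdev(assumed))              # ~T/sqrt(3), about 0.029: the scatter the budget relies on

biased = [design + random.uniform(0.8 * T, T) for _ in range(100000)]
print(statistics.mean(biased))                 # ~3.045: mean no longer at the design value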
I am astonished by how many shops keep a poorly measured testplate list. If the radii on the list do not match the actual plates, the surfaces made against them cannot have their mean values at the design values, and the first condition fails before fabrication even begins. To use such a list, one must ask for each testplate to be remeasured, then reoptimize the lens to the new values, if possible. If this does not work, then new testplates may be needed and the whole procedure repeated. This drives up design costs.
Yet another issue involves element thicknesses. Many shops aim for the high end of the tolerance range so that, if the part requires reworking, the thickness is less likely to go undersize. This of course also violates the statistics. Here is an example: Fig. 1 shows the distribution function for a lens whose thickness errors are uniformly distributed. The desired Strehl ratio is achieved quite well.
Figure 1. Statistical distribution function with uniform error statistics
Compare that to the curves in Fig. 2; in this case, all thicknesses were at the high end of their range, and the yield is significantly poorer – even though, on paper, all tolerances were met.
Figure 2. Statistical distribution function when thicknesses are at the high end of the tolerance range
In this case, one could better obtain the desired results by recognizing that the thicknesses conform to different statistics. A uniform distribution across a tolerance of ±T has an RMS error of T divided by the square root of three, so for this case the thickness tolerances should be divided by the square root of three. A part made at the high end of the tightened bracket then has the same RMS error that the uniform-distribution budget assumed, and the yield improves. It is a safe bet that few shops understand this kind of subtlety.
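The arithmetic behind that factor, with an invented tolerance value, looks like this:

import math, random, statistics

T = 0.050                                      # tolerance half-width assumed in the budget, mm

uniform = [random.uniform(-T, T) for _ in range(100000)]
print(math.sqrt(statistics.mean(e * e for e in uniform)))   # ~T/sqrt(3), about 0.029 mm: what the budget counts on

# A part that always lands at the high end of the original bracket has an
# error of 0.050 mm every time, far more than the budget assumed.  Tighten
# the tolerance by sqrt(3) and its high end reproduces the assumed RMS error.
print(T / math.sqrt(3))                        # ~0.029 mm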
Success in optics requires close collaboration with the vendor, who must understand the significance of the issues presented here. If the designer can connect with the shop to this degree, there will be fewer instances of lenses that do not work as they should. It would be helpful if designers would spell out these considerations when approaching a vendor, thereby ensuring that ignorance is no longer an excuse for the failure of a project, as it has been in the past.
Donald C. Dilworth
President, Optical Systems Design, Inc.