A Short History on the Origins of Industrial Reliability

Reliability as a field of study is closely linked to the development of industrial technology. Its historical origin is difficult to pin down precisely, but it is closely tied to the development of interchangeable parts. With interchangeable parts, items are designed to be uniform in their characteristics and function. In contrast to earlier production by artisans and craftspeople, interchangeable parts are produced by tools that are themselves designed for consistency. This assurance of consistency enabled the development of statistical methodologies for inferring reliability.[1]

This idea of interchangeable parts goes back millennia.[2] However, the modern form of the idea began with the development of military components in Europe during the late 18th and early 19th centuries.[3] Perhaps circularly, uniform weaponry was itself developed to improve the reliability of the military as a whole. Prior to interchangeable parts, weapons were bespoke, with custom components made by a gunsmith or blacksmith. A component failure in combat effectively rendered the entire weapon useless and unrepairable. Even worse, bespoke weapons used unique ammunition that was often not interchangeable.

This idea was brought to America by ambassadors returning from Europe and used by Eli Whitney in the production of weapons for the United States military.[4] These early muskets began the standardization process for military procurement, and militaries demanded interchangeable parts for the improvement they brought to operational reliability. In parallel with interchangeable parts came several other accounting and production methods, the two most notable being cost accounting and the American system of production.[5] Cost accounting is a mechanism for understanding production costs so that a firm may charge an appropriate amount for its goods. Under earlier methods of production this was exceedingly difficult to quantify, but uniform production made it tractable. Similarly, interchangeable parts helped start the American system of production, in which tolerances and dimensions are specified so that the skill required to assemble component parts remains low.[6] The system also places a large emphasis on mechanized production so that output is consistent. With these ideas, reliability and quality began to grow as disciplines. In 1918, the American National Standards Institute (ANSI) was formed to provide common frameworks for the various engineering disciplines that had begun to develop around industry. Some of the earliest standards focused on measurement and dimensioning to provide consistent readings and communication between firms.[7]

Reliability engineering and quality engineering began to be formalized around this time with the work of Walter Shewhart, W. Edwards Deming, and Joseph Juran, all of Bell Labs.[8] With the advent of interchangeable parts, there is a sufficiently low degree of variation within each component that performance characteristics can be effectively measured across a wide population. Statistically, this allows reliability and quality characteristics to be inferred and controlled using techniques such as statistical process control (SPC).[9] In this sense, reliability and quality are distinct but linked areas. Quality is the degree of fitness for use, or excellence, in a product. Reliability is the change of quality over time, in both an instantaneous and an integral sense.[10] Each field has been enabled by the consistency of modern industry and accelerated by the development of computing and sensors.
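As a rough illustration of the SPC idea, a Shewhart-style control chart places limits at three standard errors around the grand mean of subgroup measurements; a point outside the limits signals a process shift rather than ordinary variation. A minimal sketch in Python (the function name and the averaged-sigma estimate are my own simplifications, not a standards-grade implementation):

```python
import statistics

def xbar_control_limits(samples):
    """Shewhart-style X-bar control limits from measurement subgroups.

    `samples` is a list of subgroups (lists of measurements of equal size).
    Limits sit at +/- 3 standard errors around the grand mean: the
    classic "three sigma" rule from statistical process control.
    """
    means = [statistics.mean(s) for s in samples]
    grand_mean = statistics.mean(means)
    # Average within-subgroup standard deviation as a crude sigma estimate
    sigma = statistics.mean(statistics.stdev(s) for s in samples)
    std_error = sigma / len(samples[0]) ** 0.5
    return grand_mean - 3 * std_error, grand_mean + 3 * std_error
```

A subgroup mean falling outside `(lcl, ucl)` would then be flagged as a special cause worth investigating.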

Modern reliability engineering takes many forms and spans all types of hardware and software systems. Within the software domain, reliability is a paramount concern that has seen significant improvement in recent decades. Modern systems have out-scaled any single machine, so large companies rely on many inter-networked computers to support operations. Scaling across computing resources is an important area of computer science research today, at varying degrees of similitude from networks to multi-threaded processors.[11] Within database engineering, ACID (Atomicity, Consistency, Isolation, Durability) guarantees are an area of intense research, with a formalization of the concept in 1983.[12] Similarly, programming languages have seen extensive reliability-focused development, with standards for testing, code coverage, and formal verification in use for systems where reliability is of the utmost importance. With formal verification, a program may be checked for correctness by a second program that interprets the flow of logic to ensure data types and structures are sufficiently correct.
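To make the atomicity in ACID concrete, here is a small sketch using Python's built-in sqlite3 module (a simplified illustration of my own, not drawn from the references): a simulated failure mid-transaction causes every write in the transaction to be rolled back, leaving the database as it was.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
con.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
con.commit()

try:
    # The connection context manager commits on success
    # and rolls back the whole transaction on an exception.
    with con:
        con.execute(
            "UPDATE accounts SET balance = balance - 50 WHERE name = 'alice'")
        raise RuntimeError("simulated crash mid-transfer")
except RuntimeError:
    pass

# The partial transfer was rolled back: alice still has her full balance.
balance = con.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'").fetchone()[0]
```

Either both sides of the transfer happen or neither does; that all-or-nothing property is the "A" in ACID.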

Modern reliability engineering in hardware makes extensive use of simulation tools and testing to support results. When coupled with large-scale computing systems, processes are more tightly controlled and diagnosed. This has reached traditional engineering practice with tools such as Product Lifecycle Management (PLM) software, which tracks each component of a design in digital form. Most recently there has been interest in digital twins for products, where production data is stored about each product on top of this PLM data.[13] The idea is to enable improved quality and reliability by overlaying real-world measurements onto software models to infer quality and reliability characteristics.

Overall, the study of reliability engineering is linked to the ability to infer the success of a given process based on past results. Prior to the industrial revolution, bespoke manufacturing provided little means for studying the interaction of materials, process, and use. However, standardization has given systems a degree of consistency that permits statistical analysis, which is then applied across systems and at varying levels of specificity to study and improve them. Reliability engineering touches all levels of engineering and is a factor of growing importance. Moreover, reliability is an important internal and external factor in the development of products and the processes that make them.

References

[1] “History of Reliability Engineering,” American Society for Quality – Reliability and Risk Division.

[2] “ALRI AncRomUnit4 Images.” [Online]. Available: http://www.mmdtkw.org/ALRIAncRomUnit4Slides.html.

[3] C. C. Gillispie, “Science and Secret Weapons Development in Revolutionary France, 1792-1804: A Documentary History,” Hist. Stud. Phys. Biol. Sci., vol. 23, no. 1, pp. 35–152, 1992.

[4] R. S. Woodbury, “The Legend of Eli Whitney and Interchangeable Parts,” Technol. Cult., vol. 1, no. 3, pp. 235–253, 1960.

[5] S. J. Hu, “Evolving Paradigms of Manufacturing: From Mass Production to Mass Customization and Personalization,” Procedia CIRP, vol. 7, pp. 3–8, 2013.

[6] T. Ohno, “Toyota Production System: Beyond Large-Scale Production,” 1978. [Online]. Available: https://elsmar.com/Cove_Premium/Toyota%20Production%20System/ToyotaProdSys_Paper.pdf.

[7] “ANSI: Celebrating 100 Years: 1918 – 2018.” [Online]. Available: https://www.ansi.org/about_ansi/introduction/history. [Accessed: 08-Dec-2019].

[8] “Western Electric History.” [Online]. Available: https://www.webcitation.org/5wDkkOkcj?url=http://www.porticus.org/bell/westernelectric_history.html.

[9] “Remembering Joseph Juran And His Lasting Impact on Quality Improvement,” Six Sigma Daily, 28-Feb-2018.

[10] “8.1.1.1. Quality versus reliability.” [Online]. Available: https://www.itl.nist.gov/div898/handbook/apr/section1/apr111.htm.

[11] K. A. Zimmermann, “Internet History Timeline: ARPANET to the World Wide Web,” livescience.com, 27-Jun-2017. [Online]. Available: https://www.livescience.com/20727-internet-history.html.

[12] T. Wang, J. Vonk, B. Kratz, and P. Grefen, “A survey on the history of transaction management: from flat to grid transactions,” Distrib. Parallel Databases, vol. 23, no. 3, pp. 235–270, Jun. 2008.

[13] “The future for industrial services: the digital twin,” p. 8.

Cube 3D RepRap Conversion

The Cube 3D was released by 3D Systems in 2014. It is a nice printer, but it was mostly locked into their ecosystem. Since support and cartridge production have mostly ceased, these can be found on eBay for under 200 USD. I saw some posts on Reddit and Thingiverse about converting these to open-source RepRaps, and remembered that they have quite nice motion systems. Some nice features:

  • Die-cast frame, nicely milled for squareness
  • Linear ball-bearing rails
  • GT2 belts with 14 tooth pulleys on X,Y,Z
  • Lots of holes for mounting components

I patiently watched eBay over the summer and sniped two Cubes for $80 and $120. So about $100 apiece for stepper motors, linear rails, pulleys, belts, and a frame. Quite a good deal in my book. I started the build in July, but due to grad school I only finished in December. Fortunately my work-horse Mendel90 is still alive and kicking, and was able to print the parts for the Cube build. The Mendel90 is a great design; mine has been through four moves in college, travel to and from schools, and being left for dead in a sub-freezing garage during NY winters. But it still prints pretty darn well for being almost 5 years old, albeit with poor tolerances. I wanted something a little smaller, more robust, and with the latest features. I came up with the following feature list for the build:

  • Trinamic drivers
  • Auto-leveling bed
  • All metal construction
  • Self contained (PSU + Filament spool onboard)
  • All Metal Hot-side
  • Total Cost <$600

As a personal challenge, I did most of the design in Fusion 360, and some parts in FreeCAD. I would normally do something like this in OpenSCAD, but I wanted to learn more traditional design tools. Fusion 360 now runs on Linux with Lutris, which is quite nice and does not need a Windows VM. I could have gone out and bought a Prusa Mk3 for a little more. However, I think this printer is more compact and accurate, and it keeps the RepRap lineage going.

Bill of Materials

The high-level Bill of Materials came out quite well. I had pretty much all the hardware and connectors on hand to finish it up. Overall the big-ticket items added up to just under 400 USD. Adding in all the extras I had on hand, I am probably still under 500 USD for the build.

Part                   Source       Cost   Link
Cube 3D                eBay         $100
Meanwell 350W PSU      Amazon       $30    https://www.amazon.com/gp/product/B013ETVO12/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1
Einsy Rambo 1.1        UltiMachine  $110   https://ultimachine.com/products/einsy-rambo-1-1
E3D V6 (Bowden, 24V)   E3D          $75    https://e3d-online.com/v6-all-metal-hotend
Titan Extruder         E3D          $78    https://e3d-online.com/titan-extruder

Approx. Total: $393

Marlin Config:

https://github.com/sjkelly/Marlin/tree/sjk/cube3d_1x

Github Repo:

https://github.com/sjkelly/cube3d-reprapped

State of the Descartes 2019-12-22

Since my last post I have ended up in somewhat of a rabbit hole. A good one, at that. My original intentions and goals were performance and feature oriented. The past few months I have been taking a more foundational approach to implicit solid modeling in Julia, with a focus on meshing.

As part of my master’s capstone project I worked on faster meshing of both triangular surfaces and tetrahedral solids. These are updates to Meshing.jl and the new DistMesh.jl. I will give a brief overview of DistMesh here, and will go into further discussion in a later blog post.

DistMesh.jl

DistMesh was first described by Per-Olof Persson in his thesis at MIT. It is a mechanism for using the underlying signed distance function to generate high-quality tetrahedral meshes. High quality in this sense means the solid-filling tetrahedra are as close to regular as possible, which provides good integration characteristics in Finite Element Analysis. Originally I was trying to use TetGen to generate a tetrahedralization of a mesh from Meshing, but this was far too slow, since isosurface extraction generates too many triangles (100k-1M, in my case) to process efficiently on a consumer PC. DistMesh.jl is still in development, and there are a few more experiments and improvements I want to make before discussing it further. However, it uses TetGen for Delaunay triangulation and runs the refinements in Julia. I leave you with a few nice histograms showing the element qualities:
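For a sense of what "high quality" means numerically, a common scale-invariant measure compares a tetrahedron's volume to its RMS edge length, scoring a regular tetrahedron at 1.0 and a degenerate sliver near 0. A rough sketch in Python (DistMesh itself optimizes a radius-ratio measure; this normalized-volume variant is just an illustrative stand-in of my own):

```python
import math

def tet_quality(p0, p1, p2, p3):
    """Scale-invariant tetrahedron quality: 6*sqrt(2)*V / l_rms^3.

    Equals 1.0 for a regular tetrahedron and tends to 0 as the element
    flattens, a common proxy for the radius-ratio quality measures used
    in mesh refinement. Each point is an (x, y, z) tuple.
    """
    def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    def cross(a, b):
        return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    volume = abs(dot(u, cross(v, w))) / 6.0   # scalar triple product / 6
    edges = [sub(a, b) for a, b in
             [(p0, p1), (p0, p2), (p0, p3), (p1, p2), (p1, p3), (p2, p3)]]
    l_rms = math.sqrt(sum(dot(e, e) for e in edges) / 6.0)
    return 6 * math.sqrt(2) * volume / l_rms**3
```

Histograms of a metric like this (or of dihedral angles, as below) are the standard way to judge how close a tetrahedral mesh is to ideal.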

dihedral

Meshing.jl

Prior to mid-summer my general thinking for meshing was focused on accuracy for 3D Printing (Dual Contours) and adaptivity. These are both still goals, and I have made some proof-of-concepts for each. However, I decided to focus on some low-hanging fruit in the uniform sampling approaches. In particular I made some API improvements to allow for direct function sampling. Some of this is discussed on the Julia Discourse. This approach is much more memory efficient, and for simple geometries the performance is quite good. There is also nice scaling with thread counts. I suspect the performance cross-over point to be quite high, because Julia is able to generate good SIMD-vectorized code.
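The direct-sampling idea can be sketched outside Julia too: the sampler calls the implicit function as it walks the grid, rather than first filling a dense value array. A rough Python illustration with the gyroid (function names here are mine, not the Meshing.jl API), counting surface crossings along one axis without storing the volume:

```python
import math

def gyroid(x, y, z):
    # Implicit gyroid surface: the zero level set f = 0 is the surface.
    return (math.sin(x) * math.cos(y)
            + math.sin(y) * math.cos(z)
            + math.sin(z) * math.cos(x))

def count_sign_changes(f, n, lo=0.0, hi=2 * math.pi):
    """Scan an n^3 grid, calling f directly instead of reading a
    precomputed array, and count sign changes along x (each change
    marks a cell edge the surface crosses)."""
    h = (hi - lo) / (n - 1)
    count = 0
    for j in range(n):
        for k in range(n):
            prev = f(lo, lo + j * h, lo + k * h)
            for i in range(1, n):
                cur = f(lo + i * h, lo + j * h, lo + k * h)
                if prev * cur < 0:
                    count += 1
                prev = cur
    return count
```

Only one scalar per step is kept in memory, which is the appeal of direct function sampling for large grids.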

gyroid

Descartes.jl

There are now some basic docs! The only thing written up so far is a basic pipeline. I still need to add docstrings for the primitives and operations.

I also added support for 2D geometry and extrusion. An example is here.

Future

There is still a lot to do on each project, but each seems to be maturing at a good clip. In January I will be starting to research topology optimization at Stony Brook and take classes. This is likely to consume a lot of my time. I am trying to focus on a few objectives over the break:

  1. Shake the bugs out of Meshing and get a multithreaded API ready.
  2. Prepare an initial release of DistMesh
  3. Make a DescartesLive.jl library with some sort of reload-preview workflow like OpenSCAD.


How Ecological are 3D Printers?

Over the past few years I have been to a few presentations by HP on their Jet Fusion technology. By most metrics, Jet Fusion is the most efficient and highest-quality additive manufacturing technology. However, current systems cost over 100k USD. This means that for small-run and distributed manufacturing, FDM additive manufacturing is probably the best option. I have recently been interested in studying the economics of small-scale and localized 3D printing.

This first post is a study of the ecology of 3D printing, primarily focusing on FDM and net CO2 emissions. In a second post I will discuss the engineering and bottom-line efficiency of different 3D printing technologies.

FDM and the Environment

The RepRap project, aside from being an interesting exercise in open source hardware and distributed manufacturing, has its roots in environmentalism. The team saw local production as a mechanism for reducing dependence on supply chains and industrial manufacturing. Early in the project the team evaluated several materials for 3D printing, such as ABS, polypropylene (PP), polycaprolactone, and polylactic acid (PLA). The RepRap project found that PLA was an ideal material for printing dimensionally accurate objects and also met the environmental objectives of the project. In particular, PLA is synthesized from food starch without the use of fossil fuels. This allowed the team to explore local synthesis of PLA from locally-sourced agricultural by-products, furthering the objective of distributed production. Similarly, in 2009 a team at the Korea Advanced Institute of Science and Technology showed that PLA can be synthesized using genetically engineered bacteria, further reducing the dependence on industrial manufacturing processes. In 2016 it was announced that the French company CARBIOS is bringing this process to commercial production to address the 15% annual increase in demand for PLA. In the graph below, we can see that the carbon emissions associated with PLA are roughly 1/3 to 1/10 of the comparable emissions of traditional plastics. Demand for bioplastics as a whole has grown from 0.9 million tons in 2009 to 5.3 million tons in 2019.

pla_c02.png

Complexity and 3D Printing Efficiency

A 2013 study at Michigan Technological University by Pearce et al. examined the environmental lifecycle of 3D printing at the small scale, comparing injection molding with a sub-$1000 RepRap 3D printer. The team evaluated a baseline solid block, a waterspout, and a fruit juicer for the overall energy demands of production, including 3,000 miles of transportation in the injection molding case. They showed that for low-complexity components the RepRap was an inferior manufacturing method with regard to emissions, unless the fill density of the component was reduced. However, for a high-complexity component such as the waterspout, the emissions from 3D printing were a fraction of those from conventional manufacturing. In this way, complexity in 3D printing is low-cost and allows for efficient manufacturing. The team also noted that localized manufacturing allows for control of energy sources: it is much easier to run a local 3D printing operation off a solar panel than to run an industrial-scale injection molding facility and its supporting logistics off green energy. When powering the 3D printer with photovoltaics, emissions were roughly 29% less for the block and 84% less for the waterspout. In the charts below, the energy demands of the block and waterspout, respectively, are compared to injection molding with varied infill rates and energy sources.

pearce_energy

energy_3dp_pearce

Recycling with 3D Printing

Recycling is a critical area where 3D printing can help mitigate greenhouse gas emissions. Traditionally, recycled plastic is difficult to sort at large volumes, which leads to most of the plastic collected by municipalities for recycling being sent to landfills and incinerators. A study at the University of Georgia estimated that only 9% of global plastic production is recycled, while 14% is incinerated and 79% is landfilled or littered. The same study attributes this problem to the relatively short service life of plastic goods such as disposable cutlery, packaging, and electronics. However, one of the benefits of thermoplastics is that they may be reformed several times. Meanwhile, plastic production is growing rapidly; despite this tremendous growth, the Royal Society of London estimates that only 4% of global petroleum demand goes to the production of plastics. A study of greenhouse gas emissions in the production of plastic drain covers showed that emissions are 36% lower when using recycled material.

One of the most common concerns with recycled plastic is that it no longer retains its material properties. Plastic is a polymer, made of long chains of linked monomers. Over several thermal cycles these polymer chains begin to break apart and degrade; several studies have shown that successive thermal cycles can degrade the tensile strength by roughly 5-30%. This variation in recycled plastic properties makes it difficult to account for the final mechanical strength of a component. However, for bespoke manufacturing, where the mechanical properties can be tested, this makes recycling somewhat simpler. Recycling provides somewhat marginal benefits in the overall greenhouse gas emissions associated with production and consumption; the most immediate concerns are more directly related to the pollution of the goods themselves, rather than the energy consumption of their production. However, 3D printing can provide a mechanism for alleviating some of the above by shortening supply chains and allocating materials by their required mechanical properties.

drain_caps_co2


State of the Descartes 2019-06-09

In my last post about Descartes I outlined some overall objectives for development going forward:

  • OpenSCAD near-feature parity
  • Forward differentiable representations
  • Meshing approaches for sharp corners (e.g. Dual Contours)
  • Oct-Tree sampling approaches (and interpolation of the functional representation)

All of the above objectives have been touched upon since my last post. OpenSCAD feature parity continues to progress and there have been some additions for 2D components. The issue tracking this progress is here.

Now, the last three elements are all somewhat related. I have made good progress on adjusting the functional representation API to use vector formats rather than an argument for each cardinal value. This allowed me to use the very good ForwardDiff.jl library and port Python examples for Dual Contours to Julia. I am continuing to work on the Oct-Tree system with some patches to AdaptiveDistanceFields.jl. These are all pretty closely related projects, and I hope to have a system that fully utilizes ADF, Dual Contours, and differentiable representations by the end of the summer. My hope is that this will provide a lead-in to other experiments, such as mesh-free methods for engineering analysis, in the future. In the more immediate term I will continue to refine these elements so that good-quality solid models can be generated using Julia.
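To illustrate why differentiable representations matter for dual contours: the method needs a surface normal, the gradient of the implicit function, at each sampled crossing. ForwardDiff.jl computes these exactly via forward-mode automatic differentiation; the Python sketch below stands in for that with central differences on a hypothetical sphere SDF, just to show the shape of the computation (names are mine, not any library's API):

```python
def sphere_sdf(p):
    # Signed distance to a unit sphere centered at the origin.
    return (p[0]**2 + p[1]**2 + p[2]**2) ** 0.5 - 1.0

def numeric_gradient(f, p, h=1e-6):
    """Central-difference stand-in for forward-mode derivatives.

    Dual contouring uses gradients like this as hermite normals to
    place one vertex per cell; automatic differentiation gives the
    same vector without the truncation error of finite differences.
    """
    grad = []
    for axis in range(3):
        q_plus, q_minus = list(p), list(p)
        q_plus[axis] += h
        q_minus[axis] -= h
        grad.append((f(q_plus) - f(q_minus)) / (2 * h))
    return grad
```

At a point on the sphere's surface the gradient is the outward unit normal, which is exactly the hermite data a dual-contouring vertex solve consumes.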

Screenshot from 2019-06-09 20-58-10

In addition, I have been working on a research project for my Master’s that will use Descartes and Julia’s FEniCS bindings for parametric design studies. I spent quite a bit of time updating the build process and modernizing the FEniCS.jl package for the latest versions of FEniCS and PyCall.jl. There is a pull request I opened that improves the default install by using the Conda infrastructure.

Overall, the general direction I described in late April has progressed. I feel with the pending implementation of dual contours, there should be a 0.0.1 release in the coming months.

For the 0.0.1 release the following are necessary:

  • 80% Feature parity with OpenSCAD
  • Dual Contours implementation
  • FEniCS integration
  • Documentation


State of the Descartes: 2019-04-20

csg

The biggest accomplishment recently is reworking the Descartes system to run on Julia 1.0. I last programmed heavily on 0.5 and made sparse updates to my packages for 0.6, but never fully acclimated to the differences. Going to 1.0 was a big change and I made it more difficult by not using 0.7 to get the appropriate deprecation warnings. The new package manager is definitely confusing at first, but it is way more powerful and I am finally understanding the structure of its operation.

I recently added OpenCL support for distance field computations, which speeds things up significantly. The current implementation could use some improvements, and I am learning OpenCL as I go. My current challenge is keeping the OpenCL and CPU backends in sync. There is currently a lack of tests, so I will need to write some to make sure both are exercised thoroughly. Simon Danisch has a Transpiler.jl package that I may investigate in the future. I opted for OpenCL since I want cross-platform support, and I think in the long run the SPIR-V backend in LLVM will improve, so a lot of the integration CUDA/PTX has with Julia will be similar for OpenCL/SPIR-V. I also interviewed an HPC guy for work, and he mentioned AMD is the best value right now, so SPIR-V should be getting some additional interest.

There is also now a visualization routine that is nicely integrated with MeshCat.jl. This makes it really easy to pull up some code in Atom, run it, and revisualize quickly. The current workflow should be somewhat useful for making solid models in a manner similar to OpenSCAD. Once I get my 3D printers running again I will start designing some test parts. My eventual proof-of-concept design using Descartes will be a parametric 3D printer. It may take a year to get to that point, but it should show the possibilities of turbo-charged CAD in a programming language. I have some ideas for inverting the Z axis on my RepRap Lewis design that may be worth looking at. Plus this will allow me to explore some of the robotics tools in Julia. I haven’t given it much thought, but I want to plow the farm before I plant.

In general I am working toward four somewhat similar goals in the next months:

  • OpenSCAD near-feature parity
  • Forward differentiable representations
  • Meshing approaches for sharp corners (e.g. Dual Contours)
  • Oct-Tree sampling approaches (and interpolation of the functional representation)

I may have slipped a fifth one in there. The first is almost complete, aside from 2D geometry and a few operations. Convex hulls and Minkowski sums remain open issues, since I can’t seem to find any prior work on accomplishing these on implicits. The last three objectives are closely intertwined: forward differentiable implicits help get to dual contours, while oct-tree sampling and interpolation help keep this fast.

The final thing that is probably most exciting is that I have found an advisor for a project to integrate FEA with Descartes. He is an expert in mesh generation and FEA so I am looking forward to working together to make a simple interface for engineering analysis. We suspect that the application of distance fields can improve some aspects of cubic and tetrahedral meshes. I have pretty much structured the prior objectives to provide the best-possible interface for achieving these goals.


How to generate a package for Julia 1.1

This is part of the “Relearning Julia Series”…

In short, use PkgTemplates.jl.

PkgTemplates now provides the old Pkg/PkgDev.generate function.

Background

Since 0.6 several things have changed in the Julia ecosystem. One in particular is the package manager. There is a very powerful project and environment system now available to all users, which makes the system much stronger in enterprise deployments. However, the underlying system for publishing packages has not changed very much. There is still METADATA.jl.

In 0.5 one may have just run something like:
Pkg.generate("MyPackage", "MIT")
In 0.6, I believe, PkgDev was introduced to provide some additional features for publishing packages to METADATA, and the generate function was pulled out of Base. Prior to PkgDev, one would need to tag releases and manually submit pull requests.

Later attobot was introduced to automatically generate METADATA PRs once there was a tagged release.

PkgDev unfortunately does not seem to work on Julia 0.7 and greater, so this old functionality is broken. Therefore PkgTemplates seems to be the new preferred method of generating new projects.

Printrbot Part 3 : Bowden Setup

I totally forgot to write about my bowden setup. I will keep this brief, because it is very simple. Below are pictures of the parts with comments.

Above is the bowden coupler. I parameterized it better and simplified the script over the original. The new script allows me to take into account offsets in the hole and filament output. I used a pencil sharpener to slightly taper the holes so I could thread on the nuts. I am using 6 mm OD / 4 mm ID PTFE tubing. I found I just needed to get enough leverage on the nuts and they threaded relatively easily. The part can be found here… http://www.thingiverse.com/thing:15223
I am using Stoffel15’s 9/47 Wade’s extruder with a NEMA 17 stepper. I just wanted to get more torque on the filament; no other particular reason for choosing this cold end. Parts are here… http://www.thingiverse.com/thing:4964
Here you can see the x-carriage. I put a piece of PTFE in the gap so the filament doesn’t collapse from the pressure.

Printrbot Part 2

I have been taking my time putting it together, slowly assembling. I think in my old age (18) I’ve started becoming more meticulous :p. Anyway, I am pretty much done with the build. Overall, the goals were met. It is transportable and very light, and it should serve as a good hacking platform. It is going to be located in the WPI Collablab, which is the new on-campus hackerspace a couple of freshmen started. We might be doing weekly RepRap classes to get a few more people in the space.

The hot end, from MakerGear. I ended up making a printed part to go below the wood to strengthen it; it was bending and pushing the insulator out of the groove.
What 6 feet of nichrome looks like.
Current is slightly higher than needed. (This is at 18 V; I run 35 V on the printer.) It seems to be OK on the printer so far; the temperature swing is +5 °C off target.
Dorm Room RepRap FTW. When I had my Issac here it occupied half my lower desk space, which was massively inconvenient when I wanted to spread out to study.

Overall, the issues to fix are the temperature swing (lower the duty cycle), getting better X and Y steppers that don’t skip steps, and refining the bowden system. I ordered http://www.pololu.com/catalog/product/1209 which should have 25% more torque. They are back-ordered, so I might be waiting a while. In the meantime I put the feet on blocks and added bigger fans and heatsinks to the motors, along with making the pulleys more heat friendly (http://www.thingiverse.com/thing:15566). I put the whole thing on a piece of MDF and zip-tied everything down. That makes it really nice to transport. During printing I noticed that the extruder squirted plastic. The imperfections are hard to see in the prints because of the X and Y skipping steps. Once the motors get here I will try to fine-tune the retraction or get a smaller-ID tube (I am using 4 mm now). 35 V seems to work well overall too. Props to whosawhatsis again. I am going to set this aside for a while until I get the new motors, and work on some other projects. Overall, with a bowden setup I feel this could be the hottest RepRap right now.