
A Structured Framework for Implementing Design of Experiments (DOE) in an R&D Organization

DOI: https://doi.org/10.5281/zenodo.19760699

Saleh Bamakhshab

Arizona State University

Abstract: This paper presents the development and implementation of a standardized Design of Experiments (DOE) framework for an R&D organization specializing in customized electromechanical systems. Before this work, the organization exhibited inconsistent and informal experimental practices across departments, resulting in unreliable data, subjective decision-making, and project inefficiencies. Through systematic assessment of existing workflows, discussions with department heads, and evaluation of industry best practices, a unified DOE process and an experiment request form were developed to improve the planning, execution, and documentation of experiments. A pilot study using historical experimental data was restructured using the proposed framework to validate its effectiveness. The results demonstrate that the standardized DOE system enhances experimental clarity, reduces variability, and supports data-driven engineering decisions. The framework has been approved by organizational leadership and is now integrated into operational processes, contributing to improved R&D efficiency and better project outcomes.

Keywords: DOE, ANOVA, R&D Process Improvement, Experimental Standardization, Full Factorial Design, Experiment Request Form

  1. INTRODUCTION

    The organization is a growing research and development (R&D) entity in Saudi Arabia, specializing in customized electromechanical systems such as digital displays, uninterruptible power supplies (UPS), and other engineered solutions tailored to industrial and commercial applications. As an R&D-driven environment, experimentation plays a central role in validating design concepts, improving product reliability, optimizing performance, and supporting innovation across departments. Engineering teams frequently engage in testing new prototypes, evaluating material selections, analyzing failure modes, and verifying product specifications to ensure the delivery of high-quality solutions to clients.

Despite the importance of experimentation, the organization currently lacks a cohesive and scientifically grounded framework for planning, conducting, and documenting experimental work. Experimental activities are typically initiated based on informal decision-making, personal experience, or urgent project needs rather than a structured methodology. Each department (software, hardware, mechanical, and manufacturing) follows its own internal practices for defining objectives, selecting variables, collecting measurements, and analyzing outcomes. This fragmented approach to experimental design can lead to unreliable

    or non-reproducible data, making it difficult for engineers to draw valid conclusions or justify design decisions. Unclear experimental objectives and poorly defined measurement techniques increase the risk of misinterpretation, contributing to project delays, rework, and cost overruns. Furthermore, without a formal system for evaluating experimental results, decision-making often becomes subjective, relying heavily on individual judgment rather than evidence-based analysis.

To address these challenges, this study aims to evaluate the organization's existing experimental workflows and develop a unified framework for the Design of Experiments (DOE) tailored to its engineering environment. DOE, a structured statistical approach used across many scientific and industrial fields, provides a robust methodology for identifying key factors, optimizing processes, and minimizing variability. By adopting DOE principles, the organization can improve the reliability of its experiments, create repeatable procedures, and ensure consistent documentation across departments.

The development of the proposed framework included a detailed assessment of current practices through interviews and meetings with department heads, analysis of existing documentation, and identification of gaps in experimental planning and execution. Based on these insights, a standardized DOE workflow and an experiment request form were formulated to ensure that essential elements (such as hypotheses, factors, assumptions, constraints, and evaluation metrics) are clearly defined before initiating any experiment.

In addition to internal assessment, this study also reviews DOE implementations in relevant industries, highlighting best practices and methodologies that can strengthen the proposed system. By integrating industry-validated DOE concepts with the organization's operational needs, the resulting framework aims to enhance experimental clarity, reduce variability, and support more efficient and data-driven engineering processes. Ultimately, the standardized DOE system is expected to contribute to improved product development, streamlined workflows, and stronger overall R&D performance.

  2. RELATED WORK

Several studies in the literature emphasize the importance of the Design of Experiments (DOE) as a structured and statistically grounded methodology for enhancing product development, optimizing engineering processes, and supporting evidence-based decision-making. Duraković provides a comprehensive overview of DOE principles and industrial applications, highlighting its role in

    establishing clear cause-and-effect relationships and improving product and process parameters using statistically validated techniques. DOE is widely adopted across scientific, manufacturing, and engineering domains because it enables organizations to reduce experimentation costs while generating reliable insights from a limited number of trials.

Grice and Montgomery further expand on DOE as a systematic framework composed of several key steps: defining experimental objectives, selecting factors and levels, choosing an appropriate design structure, and performing statistical analysis to interpret results. Their work underscores the importance of selecting a design (such as screening, characterization, or optimization) based on the specific research goals, constraints, and available resources. However, choosing the most efficient model among competing alternatives remains challenging, particularly in complex engineering environments where numerous interacting variables can obscure meaningful patterns.

Janković, Chaudhary, and Goia investigate the impact of various DOE formulations on the characterization of complex multidomain systems. Their findings show that factorial, fractional factorial, and Taguchi designs each offer different levels of insight, depending on the nature of interactions among variables. These studies highlight that the selection of an appropriate DOE strategy must strike a balance between informational depth and practical feasibility, an essential consideration for R&D environments where experiments often span multiple engineering disciplines.

Integration of DOE into broader organizational and project management processes has also been extensively explored. Chao and Ishii emphasize the benefits of incorporating structured analytical tools, such as DOE and Quality Function Deployment (QFD), into early-stage project planning. Their research demonstrates that embedding DOE into project workflows improves prioritization, reduces technical risk, and supports better management of constraints such as time-to-market and financial limitations. Similarly, Souza and Paolo argue that applying DOE continuously throughout the development cycle, rather than treating it as a one-time activity, can significantly reduce development time while improving final product robustness and performance consistency.

In industrial sectors, DOE has proven particularly effective for process and product optimization. Dennison et al. applied DOE to optimize pharmaceutical coating processes, demonstrating how systematic parameter variation can reveal the influence of critical factors on droplet size and coating uniformity. Their work highlights DOE's capability to manage complex variable interactions and generate predictive models with high accuracy. In the field of building engineering, DOE has been used to evaluate and optimize thermal performance in double-skin façades through EnergyPlus simulations, demonstrating effectiveness in reducing energy consumption and improving envelope efficiency.

Collectively, these studies demonstrate the versatility and value of DOE across diverse engineering and scientific domains. DOE supports improved process reliability, minimizes variability, enhances product performance, and strengthens data-driven decision-making. Despite these benefits, many organizations, particularly those experiencing rapid growth or limited methodological standardization, struggle to implement DOE consistently. Challenges often arise due to fragmented documentation practices, inconsistent experimental workflows, and a

    lack of unified procedures across departments. This gap in practical implementation highlights the need for a structured, organization-wide DOE framework tailored to environments where diverse engineering teams conduct frequent, resource-intensive experiments.

  3. PROPOSED SYSTEM/METHODOLOGY

    1. Methodology Overview

The proposed methodology establishes a unified, repeatable, and scientifically grounded approach for planning, executing, analyzing, and documenting experiments across all technical departments within the organization. The system integrates industry-standard Design of Experiments (DOE) principles with the organization's internal structure, resource constraints, and project management lifecycle. The overarching goal is to eliminate inconsistent experimental practices, reduce human and procedural errors, improve data credibility, and accelerate decision-making.

The methodology is built on four foundational pillars:

• Standardization: A unified process flow, shared templates, and consistent documentation used across all engineering teams.

• Scientific Rigor: All experiments must follow DOE principles, proper statistical validation, and reproducible procedures.

• Cross-Department Integration: Clear distinction between requester and executor roles to avoid confusion and ensure accountability.

• Governance & Continuous Improvement: Oversight by PMO/leadership and ongoing refinement of procedures based on lessons learned.

    2. Proposed System Architecture

      1. Centralized Experiment Lifecycle Framework

        All experiments conducted within the organization will follow a six-phase standardized lifecycle:

        1. Initiation & Problem Definition

        2. Experimental Planning (DOE Design)

        3. Approval & Resource Allocation

        4. Experiment Execution

        5. Data Analysis & Interpretation

6. Reporting, Archiving & Continuous Improvement

Each phase includes clearly defined inputs, outputs, stakeholders, quality gates, and documentation requirements to ensure transparency and repeatability.
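As an illustration only, and not part of the approved framework itself, the sketch below shows one way the six phases and their quality gates could be represented in code (Python) for a future workflow-tracking tool; the gate descriptions and the helper function are assumptions.

```python
# Minimal sketch: the six-phase experiment lifecycle as data, e.g. for a
# future workflow-tracking tool. Gate descriptions are assumptions.
from enum import Enum
from typing import Optional

class Phase(Enum):
    INITIATION = 1   # Initiation & Problem Definition
    PLANNING = 2     # Experimental Planning (DOE Design)
    APPROVAL = 3     # Approval & Resource Allocation
    EXECUTION = 4    # Experiment Execution
    ANALYSIS = 5     # Data Analysis & Interpretation
    REPORTING = 6    # Reporting, Archiving & Continuous Improvement

# Quality gates that must pass before leaving a phase (illustrative).
QUALITY_GATES = {
    Phase.PLANNING: "SE/PMO validate statistical soundness, factors, feasibility",
    Phase.APPROVAL: "CTO approves resources and execution team",
    Phase.EXECUTION: "PMO/SE review completeness and adherence to the DOE design",
}

def next_phase(current: Phase) -> Optional[Phase]:
    """Advance the lifecycle; returns None once Reporting is complete."""
    return Phase(current.value + 1) if current.value < Phase.REPORTING.value else None
```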

    3. Detailed Methodology Steps

      1. Phase 1: Initiation & Problem Definition

        Owner: Experiment Requester (Technical Lead, System Engineer, Hardware/Software Engineer)

        Key Activities:

        Define the technical problem with measurable criteria and engineering relevance.

        Capture experiment objectives (screening, optimization, verification, comparison).

Identify constraints, boundary conditions, and assumptions.

Review previous experiments to prevent duplication and reuse applicable insights.

        Deliverable:

Completed Experiment Request Form (Sections 1-4): Problem Definition, Objectives, Inputs/Outputs, Assumptions
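For illustration, the information captured in Sections 1-4 of the form could be modeled as a simple data structure; the sketch below is a hypothetical Python representation (the field names are assumptions, and the actual deliverable remains the document template).

```python
# Minimal sketch: Sections 1-4 of the Experiment Request Form as a data
# structure. Field names are hypothetical; the real deliverable is the form.
from dataclasses import dataclass, field

@dataclass
class ExperimentRequest:
    problem_definition: str          # measurable, engineering-relevant problem
    objective: str                   # screening, optimization, verification, comparison
    inputs: list[str] = field(default_factory=list)       # candidate factors
    outputs: list[str] = field(default_factory=list)      # responses to be measured
    assumptions: list[str] = field(default_factory=list)
    constraints: list[str] = field(default_factory=list)
    prior_experiments: list[str] = field(default_factory=list)  # avoid duplication

request = ExperimentRequest(
    problem_definition="Heater surface temperature exceeds target at 40 °C ambient",
    objective="comparison",
    inputs=["shunt surface finish", "thermal paste", "enclosure surface treatment"],
    outputs=["heater surface temperature (°C)"],
)
print(request)
```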

2. Phase 2: DOE-Based Experimental Planning

Owner: Requester + Experiment Executor

Reviewer: SE/PMO/Technical Reviewer

        Activities:

Select the appropriate DOE model: OFAT, full factorial, fractional factorial, Taguchi, screening, or response-surface design (see the design-generation sketch at the end of this phase).

        Identify:

        • Inputs (factors)

        • Levels

        • Controllable factors

        • Nuisance (uncontrollable) factors

        Define measurement methods: instrumentation, sampling rate, calibration criteria.

        Prepare process sequence, environmental conditions, and safety/risk requirements.

        Tools:

        • DOE software (e.g., JMP, Minitab, or equivalent)

        • Calibration sheets

        • Risk assessment checklists

          Deliverable:

        • Finalized DOE design section in the Request Form

        • Draft Experimental Procedure

          Quality Gate:

          • SE/PMO validates statistical soundness, factor selection, and feasibility.
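As a companion to the design-selection activity above, the following minimal sketch shows how a two-level full factorial run matrix could be generated and randomized using only the Python standard library. The factor names and levels are illustrative; in practice the design would come from JMP, Minitab, or equivalent DOE software.

```python
# Minimal sketch: generate and randomize a 2-level full factorial run matrix.
# Factor names and levels are illustrative; real designs come from JMP/Minitab.
import itertools
import random

factors = {
    "shunt_surface_finish": ["as-machined", "polished"],
    "thermal_paste": ["standard", "high-conductivity"],
    "enclosure_treatment": ["untreated", "black powder-coated"],
}

# 2^3 = 8 runs: every combination of the factor levels.
runs = [dict(zip(factors, combo)) for combo in itertools.product(*factors.values())]

random.seed(7)        # fixed seed only so the sketch is reproducible
random.shuffle(runs)  # randomized run order to guard against time-ordered bias

for order, run in enumerate(runs, start=1):
    print(order, run)
```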

3. Phase 3: Approval & Resource Allocation

Owner: Department Manager + PMO

Approver: CTO

Activities:

Validate resource availability:

• Facilities (DFMA heating chamber, power sources, etc.)

• Manpower

• Required software tools

• Budget

Confirm execution team assignments.

Approve the experiment for execution.

Deliverables:

• Fully approved Experiment Request Form

• Resource allocation plan

      4. Phase 4: Experiment Execution

        Owner: Executing Department (e.g., DFMA, Software, Mechanical, Hardware)

        Activities:

Conduct the experiment as per the standardized procedure.

Capture data using verified instruments (e.g., thermocouples, data loggers).

Maintain execution logs (operator, date, conditions). A sample log-record sketch follows this phase.

Ensure:

• Randomization

• Replication

• Control of nuisance factors

        Deliverable:

        Raw data package

        Execution log with deviations (if any)

        Quality Gate:

        PMO/SE review for completeness and adherence to DOE design.
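For illustration only, a single execution-log record from this phase could be captured in machine-readable form as sketched below; the field names and values are assumptions rather than a prescribed schema.

```python
# Minimal sketch: one execution-log record per experimental run.
# Field names and values are illustrative assumptions, not a required schema.
import csv
from datetime import date

LOG_FIELDS = ["run_order", "operator", "date", "chamber_temp_C",
              "instrument_ids", "measured_temp_C", "deviations"]

record = {
    "run_order": 1,
    "operator": "technician-01",
    "date": date.today().isoformat(),
    "chamber_temp_C": 40.0,            # controlled condition
    "instrument_ids": "TC-07;DL-02",   # calibrated thermocouple and data logger
    "measured_temp_C": 78.4,           # hypothetical reading
    "deviations": "",                  # note any departure from the procedure
}

with open("execution_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
    writer.writeheader()
    writer.writerow(record)
```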

5. Phase 5: Data Analysis & Interpretation

Owner: Requester + SE/Technical Analyst

Activities:

        Perform statistical analysis:

        ANOVA

P-value evaluation

        Effect plots and interaction analysis

Validate factor significance at the 95% confidence level (see the analysis sketch after this phase).

Draw conclusions linked back to objectives.

        Propose engineering recommendations.

        Tools:

JMP statistical software

SPC charts (if applicable)

Deliverables:

        DOE Analysis Report

        Engineering Recommendations Summary
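The statistical steps in this phase can also be reproduced outside JMP. The sketch below uses Python with pandas and statsmodels on entirely hypothetical response values (the analysis in this work was performed in JMP) to show how a factorial ANOVA and the 95%-confidence significance check could be run; the three-factor interaction is pooled into the error term because an unreplicated 2^3 design leaves no separate error estimate.

```python
# Minimal sketch: ANOVA for an unreplicated 2^3 factorial experiment.
# The coded factors and temperature values are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "A": [-1,  1, -1,  1, -1,  1, -1,  1],   # shunt surface finish (coded)
    "B": [-1, -1,  1,  1, -1, -1,  1,  1],   # thermal paste (coded)
    "C": [-1, -1, -1, -1,  1,  1,  1,  1],   # enclosure surface treatment (coded)
    "temp": [85.8, 85.1, 77.7, 77.4, 82.0, 82.9, 74.3, 74.6],  # hypothetical °C
})

# Main effects and two-factor interactions; the ABC term is left out so it
# serves as the (1 degree-of-freedom) error estimate.
model = smf.ols("temp ~ (A + B + C)**2", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)
print(anova)

# Flag terms significant at the 95% confidence level (alpha = 0.05).
print(anova[anova["PR(>F)"] < 0.05])
```

With replicated runs, a full model including the three-factor interaction could be fitted instead and tested against a genuine pure-error term.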

6. Phase 6: Reporting, Documentation & Continuous Improvement

Owner: Requester

Reviewer: PMO

Activities:

Prepare a concise report containing:

• Results summary

• Statistical outputs

• Recommendations

• Lessons learned

Archive:

• Request form

• Raw data

• Reports

• JMP files

Conduct a short after-action review with all involved teams.

Deliverables:

• Final Experiment Report

• Archived experiment package

• Updated process improvements (if required)

    4. Supporting Components of the System

      1. Standardized Templates and Forms

• DOE Request Form (already developed in this article)

• Execution Log

• Data Capture Template

• DOE Report Template

• Lessons Learned Template

These templates enforce uniformity and reduce report-writing time.

      2. Governance Structure

        CTO: Final approval authority

        PMO: Process owner ensuring compliance

SE Team: Technical reviewers; maintain DOE standards

DFMA/Tech Departments: Execution teams

        Quality Team (optional future integration): Validation and audits

      3. Knowledge Management

Create a structured repository that includes:

• All historical experiments

• DOE templates

• Best practices

• Calibration documentation

    5. Expected Outcomes of the Proposed System

    • Reduced variance in experimental practices across departments.

    • Increased statistical credibility of engineering decisions.

    • Faster experimentation cycles due to standardized templates.

    • Improved cross-department collaboration via clear roles.

    • Better resource planning and reduced rework.

    • Documented knowledge retained within the organization.

    The proposed methodology transforms experimental activities from informal, experience-driven practices into a structured, auditable, and statistically robust system. By institutionalizing DOE principles, clearly defining roles, and integrating standardized templates into daily workflows, the organization can achieve more consistent results, improved project timelines, and greater overall R&D efficiency.

The framework was validated on a pilot thermal-management experiment, presented in detail in Section 4. The analysis confirms that only two main factors (thermal paste and enclosure surface treatment) were responsible for measurable changes in heater temperature. The key findings are summarized below.

    1. Model Adequacy

The overall model showed a p-value of 0.0012, confirming strong statistical significance.

      Low error contribution indicates that nuisance factors (power fluctuation, instrument accuracy) did not materially influence results.

      The DOE model was appropriate, stable, and statistically valid.

2. Impact of Factors

Thermal Paste (Factor B)

• Exhibited the highest sum of squares; it was the most dominant factor in reducing heater surface temperature.

• Higher thermal-conductivity paste resulted in better heat dissipation.

Surface Treatment of Enclosure (Factor C)

• The black powder-coated aluminum enclosure improved radiative heat transfer.

• Provided a measurable temperature reduction relative to untreated aluminum.

Shunt Surface Finish (Factor A)

• Showed minimal influence on thermal performance.

  4. RESULTS AND ANALYSIS

A full factorial DOE was conducted to evaluate the effect of three factors (Shunt Surface Finish, Thermal Paste Usage, and Surface Treatment of the Aluminum Enclosure) on the thermal performance of a 10W heater. Eight randomized experimental runs were performed under controlled conditions (40°C chamber temperature, standardized humidity, and calibrated instruments).

    1. Statistical Significance

Analysis of variance (ANOVA) and p-value testing revealed:

    P-Values:

    Figure 1 Experiment Data

• Thermal Paste (Factor B): Significant (p < 0.05)

• Surface Treatment of the Enclosure (Factor C): Significant (p < 0.05)

• Shunt Surface Finish (Factor A): Not significant (p > 0.05)

• All interaction effects: Not significant (p > 0.05)

Figure 2 shows the p-values for each factor and the interactions between factors.

Figure 2 Experiment Results - P-Values

Assuming a 95% confidence level (α = 0.05), the null hypothesis H0 is retained for all factors and interactions except thermal paste and the surface treatment of the aluminum enclosure, which have p-values less than 0.05. This means these two factors significantly affect the temperature of the heater.
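Stated formally, with $p_i$ denoting the ANOVA p-value of factor or interaction term $i$, the decision rule applied here is:

$$
\text{reject } H_{0,i}\ (\text{term } i \text{ is significant}) \quad \text{if } p_i < \alpha = 0.05, \qquad \text{otherwise retain } H_{0,i}.
$$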

    ANOVA and Effect tests

The p-value of 0.0012 for the overall model, shown in Figure 3, confirms that the experiment has significant factors. The experiment's error or noise (uncontrollable factors, assumptions, etc.) is insignificant.

    Figure 3 ANOVA

It is clear from the effect tests in Figure 4 that the thermal paste has the highest sum of squares and the greatest impact on the experiment. Increasing the conductivity of the thermal paste will improve the thermal behaviour.

    Figure 4 Effect tests

  5. DISCUSSION

The implementation of a standardized Design of Experiments (DOE) framework addresses a critical gap in the organization's existing engineering workflow. Prior to this work, experimentation practices varied significantly among departments, leading to inconsistent documentation, subjective decision-making, and inefficiencies in resource allocation. By evaluating current practices and examining industry-accepted DOE methodologies, a unified and methodical process was developed to streamline experimental planning and

    execution.

    One of the key findings during the assessment phase was that most experimental activities were reactive rather than strategic. Engineers often initiated tests based on immediate project needs without a clearly defined problem statement, structured methodology, or measurable success criteria. This lack of preparation commonly resulted in inconclusive outcomes, repeated trials, and miscommunication between teams. The proposed DOE framework directly addresses these challenges by requiring engineers to specify objectives, inputs, factors, assumptions, and measurement techniques before executing any experiment. Such structured planning helps ensure that experiments are aligned with project goals and grounded in scientific reasoning.

    Another important aspect revealed through discussions with department heads is the absence of a centralized mechanism for monitoring and recording experimental activities. Each department maintained its own documentation practices, causing fragmentation of historical data and limiting opportunities for interdepartmental learning. The introduction of a standardized experiment request form and documentation template serves as a foundation for building a unified experimental knowledge base. Over time, this can significantly enhance data traceability, reduce redundancy, and support decision-making based on previous results.

    The review of DOE applications in other industries further highlighted the advantages of systematic experimentation, including improved process optimization, reduction of trial- and-error activities, and enhanced product reliability. These insights reinforce the need to adopt similar practices within the organization to improve operational efficiency. However, the transition to a standardized framework requires not only new tools but also a shift in organizational culture. Engineers must adapt to more formalized procedures and embrace a data-driven mindset. This may require training sessions, awareness workshops, and ongoing support from leadership to ensure successful adoption.

    Despite the positive impact of the proposed DOE process, several challenges may emerge during implementation. Resistance to change, variations in departmental priorities, and resource limitations could slow the adoption of new procedures. Additionally, some complex experiments may require advanced statistical methods or specialized tools that are not yet widely used within the organization. Addressing these challenges will require continuous feedback loops, refinement of the framework, and potentially the integration of digital systems to automate and simplify workflow management.

Overall, the standardized DOE framework has the potential to significantly improve the reliability, reproducibility, and efficiency of experimental activities. By promoting a structured and scientifically grounded approach, the organization can reduce project delays, enhance engineering quality, and strengthen collaboration across departments. The successful implementation of this framework will pave the way for further advancements, such as digital automation, centralized data management, and the incorporation of advanced analytics in future work.

  6. CONCLUSION

    The objective of this experiment was to identify a passive thermal management solution capable of improving the thermal behavior of the heater assembly. Three factors were selected for evaluation and comparison: shunt surface finish, thermal paste, and surface treatment of the aluminum enclosure. The results showed that only the thermal paste and the surface treatment of the aluminum enclosure had a statistically significant impact on the thermal performance. Among the examined factors, the thermal paste exhibited the highest sum of squares, indicating the strongest influence on heat transfer efficiency. Consequently, it is recommended that a thermal paste with higher thermal conductivity be used to further enhance the thermal behavior of the system.

    The organization operating this R&D environment has demonstrated strong potential within the engineering and technology sector. However, prior to this work, experimental activities were being conducted based largely on individual experience and departmental habits rather than a unified scientific methodology. Discussions with department heads revealed several recurring challenges, including inconsistent documentation, unclear experiment objectives, and a lack of standardized review mechanisms. The leadership of the organization recognized these issues and expressed strong interest in establishing a structured, scientific process for experiment management and execution. As a result, the development of a standardized workflow was conducted under direct supervision and guidance from senior technical leadership.

    Through multiple iterative sessions, engineering teams collaborated to design a comprehensive process flowchart that meets the practical needs of all technical departments while ensuring methodological rigor. Detailed guidelines were created to accompany each step of the flowchart, ensuring clarity, transparency, and ease of adoption for experiment requesters, reviewers, and execution teams. In addition to the process itself, a standardized Experiment Request Form was developed to ensure that each experiment begins with a clearly defined problem statement, measurable objectives, identified factors, constraints, resource requirements, and pre-execution checks.

The robustness of the newly developed process was validated using a pilot experiment. This validation demonstrated that the standardized methodology improved communication, enhanced documentation quality, aligned expectations among stakeholders, and ensured consistency across experimental workflows. After successful evaluation, the process received final approval from organizational leadership and department heads and is now incorporated into the organization's official operational procedures. The key lessons learned from this work are summarized below:

    • A standardized process greatly enhances communication and alignment across teams.

    • Clear definitions of roles, inputs, outputs, and review stages reduce misunderstandings and confusion.

    • It is difficult to create a single process that satisfies every department perfectly.

    • Flexibility and iterative improvement are essential for sustained adoption.

    • Soft skills play a critical role in gaining cooperation and ensuring smooth implementation.

    • Effective communication, negotiation, and stakeholder engagement are just as important as technical design.

    • Processes must always be tested and validated before organization-wide implementation.

    • Pilot trials help identify gaps and provide insight into real-world usability.

    • No process is ever fully mature; continuous refinement is necessary.

    • Feedback loops and periodic reviews help adapt the system to evolving organizational needs.

    • Subject-matter experts should be considered valuable assets, not expenses.

    • Their insights significantly improve process quality, reduce risk, and enhance long-term organizational capability.

  7. ACKNOWLEDGEMENT

The author would like to express sincere gratitude to the engineering teams, department heads, and senior technical leadership whose guidance and collaboration were instrumental throughout the development of the standardized Design of Experiments (DOE) framework presented in this work. Their willingness to participate in interviews, share practical insights, and validate process requirements greatly contributed to shaping a methodology tailored to the organization's operational needs.

    Special appreciation is extended to the PMO and System Engineering teams for their role in reviewing statistical models, ensuring methodological rigor, and supporting multiple iterations of the workflow design. The successful validation of the DOE framework through the pilot experiment would not have been possible without the cooperation and dedication of the execution teams responsible for data collection and experimental setup.

    The author also acknowledges the organizational leadership for recognizing the need to establish a unified and scientifically grounded approach to experimental work, and for approving the integration of the proposed process into official operational procedures. Their support was essential in aligning technical objectives with strategic goals.

Lastly, the author extends gratitude to all colleagues and contributors, directly or indirectly, whose expertise, encouragement, and constructive feedback helped refine this work and ensure its practical relevance for improving R&D efficiency and engineering excellence within the organization.

  8. FUTURE WORK

    Although the proposed Design of Experiments (DOE) framework establishes a structured and scientifically grounded approach to conducting experiments, several areas require further development to enhance its applicability and long-term effectiveness across the organization. The following points outline recommended directions for future work:

    Development of an Automated DOE Management System

    A digital platform or workflow management system can be developed to automate experiment requests, approvals, documentation, data collection, and report generation. This will reduce manual errors, improve traceability, and standardize processes across departments.

Integration With Project Management Tools

Future improvements may include integrating the DOE process with existing project management systems (e.g., ERP, PLM, or task management tools) to ensure experiment planning aligns with project timelines, resource availability, and budget constraints. This integration can also enhance cross-department visibility of ongoing experiments.

Creation of a Centralized Experimental Data Repository

Establishing a unified database for storing all experimental data, results, and reports will help engineers re-use historical insights, reduce redundant testing, and support data-driven decision-making. This repository could later integrate with analytics or AI-based tools.

Advanced Statistical Analysis and AI Integration

Future iterations of the DOE framework may incorporate advanced statistical methods, predictive modeling, and machine learning algorithms to improve factor selection, optimize experimental conditions, and predict outcomes without requiring repeated physical testing.

Development of Training Programs and Certification

Formal DOE training modules, workshops, and internal certification programs can be introduced to ensure engineers across all departments understand and properly apply the framework. A well-trained workforce ensures consistency, accuracy, and better experiment outcomes.

    Pilot Projects and Continuous Improvement

    The DOE framework should undergo pilot implementations in multiple departments. Feedback obtained from these pilots can guide iterative refinements, ensuring the process remains practical, scalable, and aligned with real engineering needs.

    Expansion to Non-Technical and Administrative Domains

While the DOE system is initially focused on engineering experiments, future expansion could include quality assurance, operational processes, supply chain analysis, and even HR-related studies, promoting a data-driven culture across the entire organization.

    Periodic Review and Benchmarking With Industry Standards

Annual or semi-annual reviews of the DOE process will ensure it stays aligned with global best practices. Benchmarking with leading R&D organizations can help identify new trends, methodologies, and tools that could be incorporated into the framework.

  9. REFERENCES

1. JMP Statistical Discovery, "DOE Examples and Applications," 2022. [Online]. Available: https://www.jmp.com

2. S. P. Nair and T. Nguyen, "Modern Applications of Factorial Design for Thermal Optimization in Electronic Systems," International Journal of Thermal Sciences, vol. 179, pp. 1-15, 2022.

3. A. Rehman, L. Lin, and M. Hussain, "Application of DOE in R&D Environments for Process Optimization and Product Validation," IEEE Access, vol. 10, pp. 55012-55025, 2022.

4. Y. Zhao, H. Li, and R. K. Singh, "Advanced Statistical Design for Multi-Factor Engineering Experiments: A Review," Journal of Manufacturing Processes, vol. 84, pp. 450-462, 2022.

5. M. Tahir, P. Kumar, and S. Lee, "Integrating DOE With Digital Twin for Experimental Optimization in Mechatronic Systems," Sensors, vol. 23, no. 4, 2023.

6. G. Andrews and R. Patel, "Structured Experimentation for Thermal Performance Improvement Using DOE and ANOVA," Applied Thermal Engineering, vol. 221, pp. 120123, 2023.

7. M. Al-Khlaif and B. Mohammed, "Data-Driven Experimentation Frameworks in Modern R&D Organizations," IEEE Transactions on Engineering Management, vol. 71, no. 2, pp. 350-364, 2024.

8. K. Ibrahim and H. Sun, "Full Factorial and Taguchi Designs for Electronic Enclosure Heat Optimization," Engineering Reports, vol. 6, no. 2, e12645, 2024.

9. A. Chatterjee, M. Bergström, and A. Prakash, "Optimizing Product Performance Using DOE and Machine Learning Hybrid Models," Expert Systems with Applications, vol. 231, 2023.

10. R. Torres and L. Silva, "Experimental Standardization in Multi-Department Engineering Settings," Quality Engineering, vol. 36, no. 1, pp. 112-126, 2024.

11. J. Omar and S. Ahmed, "Effectiveness of Structured DOE Workflows for R&D Labs," IEEE Engineering Management Review, vol. 52, no. 3, pp. 61-75, 2024.

12. G. Wang, X. He, and C. Yu, "Advanced ANOVA Techniques for High-Interaction Engineering Factors," Measurement, vol. 209, 2023.

13. A. F. Malik and J. Seo, "Thermal Management Optimization via Designed Experiments in Aluminum-Based Systems," Materials Today: Proceedings, vol. 69, 2023.

14. K. Y. Lee and M. Park, "Design of Experiments in Electromechanical System Development," Journal of Mechanical Science and Technology, vol. 37, pp. 441-455, 2023.

15. A. Rahman, "Improving Engineering Decision-Making Through DOE-Driven Frameworks," Procedia CIRP, vol. 118, pp. 827-833, 2023.

16. S. Erdogan and P. Quinn, "R&D Process Optimization Using Structured Statistical Methods," Journal of Industrial Engineering and Management, vol. 16, no. 4, pp. 75-94, 2023.

17. IEEE Std. 1722-2023, "Recommended Practice for Data-Driven Engineering and Experimental Validation," IEEE, 2023.

18. M. Köhler and D. Schmidt, "Reproducibility Enhancement in Engineering Experiments Using Standardized DOE Protocols," Production Engineering, vol. 18, pp. 299-309, 2024.

19. B. Aslam and K. Riaz, "Cross-Department Integration of Experimentation Workflows in Industrial R&D," Journal of Engineering and Applied Sciences, vol. 19, no. 1, pp. 50-66, 2024.

20. S. Yamashita, "Continuous Improvement and DOE Integration in Technology Development," International Journal of Production Research, vol. 62, no. 14, pp. 4150-4165, 2024.

About Authors

Saleh is an engineer educated at ASU and KAU with 10 years of experience across design, integration, prototyping, and verification. His work spans optical instrumentation, multi-sensor systems (camera, LRF, thermal, SWIR), precision mechanical design, DFMA, and tolerance-driven manufacturing. Interests include multi-spectral imaging, range-sensing, ruggedized packaging, and reliability under shock/vibration, as well as applying data/AI to inspection and automation. He has contributed across the V-model from requirements to field validation.