| Field | Value |
|---|---|
| Graduate student | 劉得崙 Liu, Te-Lun |
| Thesis title | 將非結構化3D網格模型轉換為參數可編輯性B-Rep格式的雙路徑結構抽象法 (Dual-Path Structural Abstraction for Converting Unstructured 3D Meshes into Editable B-Rep Format) |
| Advisor | 李哲榮 LEE, CHE-RUNG |
| Committee members | 洪仕軒 Shih-Hsuan Hung; 王昱舜 Yu-Shuen Wang |
| Degree | Master (碩士) |
| Department | College of Electrical Engineering and Computer Science, Department of Computer Science (電機資訊學院 - 資訊工程學系) |
| Publication year | 2025 |
| Academic year of graduation | 113 |
| Language | English |
| Pages | 35 |
| Keywords | Shape analysis, Parametric curve and surface models, Shape reconstruction, Curve-surface classifier, Parametric geometry |
| Views / Downloads | 39 / 4 |
We propose a novel dual-path structural abstraction framework for converting unstructured 3D mesh models into editable B-Rep representations compatible with CAD environments such as Rhino Grasshopper. Our method addresses a critical bottleneck in design workflows: existing 3D assets lack semantic structure and parametric control. Building on 3D part segmentation, we introduce a curve-surface classifier that automatically distinguishes mesh components into curve-like and surface-like categories, apply 1D skeletonization and 2D medial axis transform techniques, respectively, to extract the model's geometric feature elements, and reconstruct these feature elements as editable NURBS-based B-Rep components. Coupled with Grasshopper's parametric editing interface, we chain anchor-aware manipulators and part-dimension editing into the generated B-Rep model, enabling intuitive part-level deformation and parametric editing. Extensive experiments with real-world furniture models demonstrate our pipeline's ability to reduce manual reconstruction effort while preserving semantic coherence and editability. Our approach bridges the gap between generative 3D content and structured parametric modeling, providing a solid foundation for interactive design, reverse engineering, and geometry-aware reuse of mesh-based assets.
We present a novel dual-path structural abstraction framework for converting unstructured 3D mesh models into editable B-Rep representations compatible with CAD environments such as Rhino Grasshopper. Our method addresses a critical bottleneck in design workflows: the lack of semantic structure and parametric control in existing 3D assets. By introducing a curve-surface classifier, we automatically distinguish mesh components into curve-like and surface-like categories, applying 1D skeletonization and 2D medial axis transformations, respectively. These abstractions are then reconstructed as editable NURBS-based B-Rep components with anchor-aware manipulators, enabling intuitive part-level deformation and parametric editing. We also develop a tightly integrated Grasshopper toolkit that supports modular editing, geometric control, and downstream fabrication readiness. Extensive experiments with real-world furniture models demonstrate our pipeline’s ability to reduce manual reconstruction effort while preserving semantic coherence and editability. Our approach bridges the gap between generative 3D content and structured parametric modeling, offering a robust foundation for interactive design, reverse engineering, and geometry-aware reuse of mesh-based assets.
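The dual-path split hinges on deciding whether a segmented mesh component is curve-like or surface-like before dispatching it to skeletonization or the medial axis transform. The thesis uses a dedicated curve-surface classifier; as an illustrative stand-in only, a simple PCA-eigenvalue heuristic (the function name, the `curve_ratio` threshold, and the synthetic components are all assumptions, not the thesis's method) conveys the idea:

```python
import numpy as np

def classify_component(points, curve_ratio=0.6):
    """Heuristic curve-vs-surface test (illustrative stand-in for the
    thesis's learned classifier): a component is 'curve-like' when one
    principal direction dominates the variance, 'surface-like' when two
    directions carry comparable variance."""
    centered = points - points.mean(axis=0)
    # Eigenvalues of the 3x3 covariance matrix, sorted largest first.
    evals = np.linalg.eigvalsh(np.cov(centered.T))[::-1]
    evals = evals / evals.sum()
    return "curve-like" if evals[0] > curve_ratio else "surface-like"

# Synthetic stand-ins for segmented furniture parts:
t = np.linspace(0.0, 1.0, 200)
# An elongated rod: variance concentrated along one axis.
rod = np.c_[t, 0.01 * np.sin(20 * t), 0.01 * np.cos(20 * t)]
# A nearly flat panel: variance spread over two axes.
u, v = np.meshgrid(t, t)
panel = np.c_[u.ravel(), v.ravel(), 0.01 * np.sin(5 * u).ravel()]

print(classify_component(rod))    # elongated part -> "curve-like"
print(classify_component(panel))  # flat part -> "surface-like"
```

In a real pipeline this decision would be made per segmented component, with the curve-like parts routed to 1D skeletonization and the surface-like parts to the 2D medial axis transform, as the abstract describes.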
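On the curve path, the extracted 1D skeleton must then be refitted as an editable spline so it can become a B-Rep edge or rail. A minimal sketch of that refitting step, using SciPy's B-spline fitting as a stand-in for the thesis's NURBS reconstruction (the synthetic `skeleton` polyline and the smoothing tolerance are assumptions):

```python
import numpy as np
from scipy.interpolate import splprep, splev

# Hypothetical skeleton polyline extracted from a curve-like component
# (here: a helical arc sampled at 50 points, shape (3, N)).
t = np.linspace(0.0, np.pi, 50)
skeleton = np.vstack([np.cos(t), np.sin(t), 0.1 * t])

# Fit a cubic smoothing B-spline through the skeleton samples.
# A B-spline is the unit-weight special case of a NURBS curve, so its
# control points map directly onto an editable CAD curve.
tck, u = splprep(skeleton, s=1e-4, k=3)

# Resample the fitted curve densely for downstream reconstruction.
resampled = np.array(splev(np.linspace(0.0, 1.0, 200), tck))
```

Editing the resulting control points (rather than thousands of raw mesh vertices) is what makes the reconstructed part parametric and lightweight to manipulate.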