Exploring Data Exchange with Rhino3D

[Video: Exploring Data Exchange with Rhino 3D – FLUX.io Workshop, from Performance Workshops on Vimeo]

 


Parametric Patterns – Scaling

 

The module is scaled and arrayed within the model to produce variations of the facade pattern. Moving the number slider changes the scale factor applied to both input reference curves, and the scaled curves are then arrayed according to that factor. For example, if the original pattern is an array of 15 hexagons in the X axis and 9 in the Z axis, a scale factor of 0.5 halves the module size, so twice as many modules fit in each direction (30 by 18). The result is a much denser pattern.
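A minimal sketch of that relationship in plain Python (the counts and module dimensions are illustrative stand-ins, not values taken from the actual definition):

    # Hypothetical module dimensions and counts, for illustration only
    base_count_x, base_count_z = 15, 9       # original hexagon counts in X and Z
    module_width, module_height = 2.0, 1.5   # assumed size of one module
    scale_factor = 0.5                       # value driven by the number slider

    # A smaller module means more copies fit in the same facade area,
    # so the array counts grow by the inverse of the scale factor.
    count_x = int(round(base_count_x / scale_factor))   # 30
    count_z = int(round(base_count_z / scale_factor))   # 18

    # The array spacing shrinks along with the module itself
    spacing_x = module_width * scale_factor
    spacing_z = module_height * scale_factor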

Creating Beam Structures with Karamba

To find the optimal cross section for the structural members that comprise the facade, I used a modified version of Junghwo Park’s example file from Karamba’s website. The example file and tutorial video can be found here.

Karama_Result_01

Overall beam optimization results from Karamba

 

Karama_Result_Detail_01

 

 

Structural_Analysis_Karamba

Karamba Example File

The main component used is Optimize Cross Section, which determines where in the model structural members need to be thicker or thinner.
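Conceptually, the component tries progressively larger sections until a member’s utilization is acceptable. A rough sketch of that idea in plain Python (the section names and capacities are made up, and real Karamba also re-runs the analysis, since changing sections redistributes forces):

    # Candidate cross sections, sorted from lightest to heaviest
    cross_sections = [
        {"name": "HEA100", "capacity": 10.0},   # capacities are hypothetical
        {"name": "HEA140", "capacity": 25.0},
        {"name": "HEA200", "capacity": 60.0},
    ]

    def pick_section(demand):
        """Return the lightest section whose utilization stays at or below 1.0."""
        for cs in cross_sections:
            if demand / cs["capacity"] <= 1.0:
                return cs["name"]
        return cross_sections[-1]["name"]       # nothing fits: use the largest

    demands = [8.0, 22.0, 55.0]                 # e.g. force per member
    print([pick_section(d) for d in demands])   # ['HEA100', 'HEA140', 'HEA200']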

 

Rhino + Revit Interoperability Workflow Using Live Objects

RH-RVT_Workflow_Diagram-01

Like the cells of a honeycomb, an interoperability workflow fits separate pieces of software together into an efficient system. In this example, we’ll create a live floor element in Revit from a surface in Rhino.

Objectives

1. Read surface by layer name and sort by elevation in Grasshopper
2. Extract parameters (curves, level names, number of floors, and elevations) and send to Flux
3. Create a level and floor element in the Flow Tool
4. Merge level and floor element in Revit

RH+RVT_Interop_GH_R01

 

Steps 1-2

 

RH-RVT_GH_01
RH-RVT_GH_02
RH-RVT_GH_03

We’ll start off with a Rhino model that contains surfaces on a designated layer. A single surface is read by index, its parameters are extracted, and the data is sent to Flux (where the magic happens).

GH_06

• Dynamic Pipeline – Set to ‘read by layer name only’
• Number slider – Select surface by index
• Flux Project – Select project
• To Flux – Send data to Flux (flow control mode: constantly)

GH_07

• Area – Compute each surface’s area centroid
• Deconstruct Point – Extract elevation (unit Z)
• Sort List – Sort surfaces by elevation (sketched in code below)
• Brep | Plane – Extract closed planar curves
• Insert Items – Construct a list of text/numerical values for level names and numbers (individual IDs)
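A GhPython-style sketch of the centroid and sorting steps, assuming the surfaces arrive in a list input named breps (the variable names are mine, not the definition’s):

    import Rhino.Geometry as rg

    # 'breps' is the assumed list input holding the surfaces from the pipeline
    elevations = []
    for brep in breps:
        amp = rg.AreaMassProperties.Compute(brep)
        elevations.append(amp.Centroid.Z)       # the centroid's unit-Z elevation

    # Sort surfaces and elevations together, lowest floor first
    pairs = sorted(zip(elevations, breps), key=lambda p: p[0])
    sorted_z = [p[0] for p in pairs]
    sorted_breps = [p[1] for p in pairs]

    # Simple level names, one per surface (the individual IDs)
    level_names = ["Level %d" % (i + 1) for i in range(len(sorted_breps))]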

 

Step 3

 

RH-RVT_Flux_Flow_01

• Create Level
• Create Floor – Requires closed input curves in a single list

The order of operations is levels first, floors second. That’s so the floor knows which level it lives on!
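To make the ordering concrete, here is a hypothetical Python sketch continuing the names from the earlier sketch; this is not Flux’s actual schema, only an illustration of why each floor needs its level to exist first (boundary_curves stands in for the closed curves from Brep | Plane):

    # Hypothetical payload structure -- NOT Flux's real schema
    levels = [{"name": name, "elevation": z}
              for name, z in zip(level_names, sorted_z)]

    # Each floor references its level by name, so levels must be created first
    floors = [{"level": name, "boundary": crv}
              for name, crv in zip(level_names, boundary_curves)]

    payload = {"levels": levels, "floors": floors}   # levels first, floors second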

RH-RVT_Flow_01

 

Step 4

 

Interoperability_HowTo_Page_5_Image_0001
Interoperability_HowTo_Page_5_Image_0002
Interoperability_HowTo_Page_5_Image_0003
Interoperability_HowTo_Page_5_Image_0004

Using the handy Flux Revit plugin, merge the level and then, you guessed it, the floor.

 

N.B. If you’re feeling lazy, simply connect each data parameter to its respective To Flux component and send all the data at once 🙂

NB_GH

Shortest Path Voxelization

This is an alternative to voxelizing according to bending moment, as in the Surface Voxelization example, and serves more as a representation of what Karamba does. The example reads a text file, created in Processing, that contains point data from which a set of curves is extracted. The curves are shortest paths from a base point to the outer boundary of the 3D array input. These path curves are then used as guides to determine where the subdivided voxels are placed.
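A minimal sketch of the import step, assuming each line of the text file stores one path as comma-separated x,y,z triples (the actual format written by Processing may differ):

    import Rhino.Geometry as rg

    def read_paths(filepath):
        """Read one polyline path per line of comma-separated x,y,z values."""
        paths = []
        with open(filepath) as f:
            for line in f:
                values = [float(v) for v in line.strip().split(",") if v]
                points = [rg.Point3d(values[i], values[i + 1], values[i + 2])
                          for i in range(0, len(values) - 2, 3)]
                if len(points) > 1:
                    paths.append(rg.Polyline(points))   # one shortest-path curve
        return paths

    # Placeholder path -- point it to wherever AgentTrail.txt lives on your machine
    curves = read_paths(r"C:\path\to\AgentTrail.txt")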

Grasshopper plugins required:

Python
MeshEdit

 

If you’d like the files, send me an email!

 

Part I

01_SP_Filepath.JPG

Step 1. For this example, we have a 3D array ready in the Surface_ShortestPath_Voxelization_01.3dm file, so first we need to extract the paths from the text file. Open ImportExport-CK05.gh and find the path parameter.

 

02_SP_Filepath.JPG

Step 2. Right-click the path parameter and select ‘Set one file path’. A new window will open. Navigate to the location of AgentTrail.txt and open the file. This sets the file path on the parameter so Grasshopper can find it on your computer.

 

03_SP_Paths.JPG

Step 3. You can now see the generated curves if you preview the curve parameter. However, as you can see, they’re shifted a bit from the 3D array. No worries, there’s a quick fix for this: bake the curves as a group so we can position them correctly.

 

04_SP_Position_Adjust1.JPG

05_SP_Position_Adjust.JPG

06_SP_Position_Adjust.JPG

07_SP_Mirror_Adjust.jpg

Steps 4-7. Go to the top view in Rhino and draw a horizontal line along the bottom edge of the 3D array (Step 5). Take the bottommost point of the group of curves and move it perpendicular to the horizontal line until they intersect (Step 6). This shifts the whole group of curves up so they fall within the 3D array. To make it perfect, the final step is to mirror the whole thing: take any point on the curves and mirror the group along the horizontal line (Step 7).
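The same fix-up could also be scripted instead of done by hand. A hedged Rhino.Geometry sketch, where offset_y, guide_y, and path_curves are stand-ins for the measured offset, the guide line’s Y position, and the baked curve group:

    import Rhino.Geometry as rg

    offset_y = 10.0    # assumed vertical offset measured in Step 6
    guide_y = 0.0      # assumed Y position of the horizontal guide line

    # Step 6: shift the whole curve group up to the guide line
    move = rg.Transform.Translation(rg.Vector3d(0, offset_y, 0))

    # Step 7: mirror the group across the plane through the guide line
    mirror_plane = rg.Plane(rg.Point3d(0, guide_y, 0), rg.Vector3d.YAxis)
    mirror = rg.Transform.Mirror(mirror_plane)

    for crv in path_curves:   # 'path_curves' is the assumed curve group
        crv.Transform(move)
        crv.Transform(mirror)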

 

08_SP_ExportPaths.JPG

Step 8. Take a look at your beautiful work and then go take a break 🙂

 

Part II

09_SP_Voxelization.JPG

Step 9. Welcome back! Open PolySubdivision_02.gh in the same Rhino file. You will see all the needed inputs on the left, as well as the output parameters near the bottom left.

 

10_SP_Array.jpg

Step 10. First, input the 3D array by right-clicking the parameter labeled ‘3D array’ and setting the geometry.

 

11_SP_Inputs.JPG

Step 11. Input all the necessary parameters accordingly. The inputs are basically the same as in the Voxel_Tests_R05.gh file, with the exception of the path curves created in Part I and an additional large-mesh layer containing the base cubes.

 

12_SP_RunScript.jpg

Step 12. Run the entire script by clicking the play button on the data dam.

 

13_SP_VoxelResult.JPG

14_SP_BakeResult.jpg

15_SP_Voxelization

Steps 13-15. Preview the results by turning on the geometry previews. Bake out the meshes through the mesh parameter labeled ‘final mesh’. And there it is in Rhino, ready to be put back into Processing.
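If you do want to push the result back to Processing, here is a hedged sketch of one way to do it, writing the baked mesh’s vertices to a plain text file (final_mesh and the output format are assumptions, not part of the original files):

    def export_mesh_vertices(mesh, filepath):
        """Write each mesh vertex as a comma-separated x,y,z line."""
        with open(filepath, "w") as f:
            for v in mesh.Vertices:
                f.write("%f,%f,%f\n" % (v.X, v.Y, v.Z))

    # 'final_mesh' would come from the parameter labeled 'final mesh'
    export_mesh_vertices(final_mesh, r"C:\path\to\VoxelResult.txt")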