Some screenshots of initial studies done with Ladybug’s Sunlight Hours analysis tools. Since this first set of studies, the pattern has increased in scale, which would change the amount of sunlight that reaches the facade during the day.
Note: Analysis was done for the winter and summer solstices, over a 6-hour analysis period.
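For context, the idea behind a sunlight-hours count can be sketched in plain Python: for each analysis point, count how many hourly sun vectors reach it unobstructed. Everything below (the occluder box, the sun directions) is a made-up stand-in, not Ladybug's actual implementation:

```python
# Hypothetical sketch of a sunlight-hours count (not Ladybug's code):
# for each analysis point, count how many hourly sun vectors reach it
# without being blocked by an occluding box.

def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the ray from origin along direction hit the box?"""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:
                return False
    return True

def sunlight_hours(point, sun_vectors, box_min, box_max):
    """Hours of direct sun = number of unblocked hourly sun vectors."""
    return sum(
        1 for v in sun_vectors
        if not ray_hits_aabb(point, v, box_min, box_max)
    )

# A 6-hour analysis period: six hourly directions toward the sun (made up).
suns = [(0, -1, 1), (0.5, -1, 1.5), (1, -1, 2),
        (1, 1, 2), (0.5, 1, 1.5), (0, 1, 1)]
hours = sunlight_hours((0, 0, 0), suns, (-1, 2, -1), (1, 4, 3))
```

Scaling the pattern up, as mentioned above, effectively changes the occluder geometry, which is why the hour counts shift.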
To find the optimal cross section for the structural members that comprise the facade, I used a modified version of Junghwo Park’s example file from Karamba’s website. The example file and tutorial video can be found here.
The main component used is the Optimize Cross Section component, which determines where in the model structural members need to be thicker or thinner.
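Conceptually, the component is doing something like the sizing loop below: pick, for each member, the lightest section whose capacity covers that member's demand. This is a heavily simplified sketch, not Karamba's actual algorithm, and the catalogue names and capacities are invented for illustration:

```python
# Simplified sketch of the idea behind cross-section optimization (not
# Karamba's actual algorithm): for each member, choose the lightest
# catalogue section whose capacity covers that member's force demand.

# Hypothetical catalogue: (name, area_cm2, capacity_kN), sorted light -> heavy.
CATALOGUE = [
    ("CHS 48x3",   4.2,  60.0),
    ("CHS 60x4",   7.0, 110.0),
    ("CHS 89x5",  13.2, 220.0),
    ("CHS 114x6", 20.4, 360.0),
]

def size_members(member_forces_kN):
    """Assign the smallest adequate section to each member force."""
    sized = []
    for force in member_forces_kN:
        demand = abs(force)
        for name, area, capacity in CATALOGUE:
            if capacity >= demand:
                sized.append(name)
                break
        else:
            sized.append(None)  # no catalogue section is strong enough
    return sized

sections = size_members([45.0, -150.0, 300.0, 500.0])
```

The real component also re-runs the analysis after resizing, since stiffer members attract more force; this sketch skips that feedback loop.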
Like a honeycomb structure, an interoperability workflow weaves software together to make an efficient system. In this example, we’ll create a live floor element in Revit from a surface in Rhino.
1. Read surface by layer name and sort by elevation in Grasshopper
2. Extract parameters (curves, level names, number of floors, and elevations) and send to Flux
3. Create a level and floor element in the Flow Tool
4. Merge level and floor element in Revit
We’ll start off with a Rhino model that contains surfaces in a designated layer.
A single surface is read by index, its parameters extracted, then sent to Flux (where the magic happens).
• Dynamic Pipeline – Set to ‘read by layer name only’
• Number slider – Select surface by index
• Flux Project – Select project
• To Flux – Send data to Flux (flow control mode: constantly)
• Area centroid
• Deconstruct Point – Extract elevation (unit Z)
• Sort List – Sort surfaces by elevation
• Brep | Plane – Extract closed planar curves
• Insert Items – Construct a list of text/numerical values for level names and numbers (individual IDs)
• Create Level
• Create Floor – Requires closed input curves in a single list
The order of operations is levels first, floors second, so the floor knows which level it lives on!
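If it helps, here is a rough Python equivalent of the sort-and-name logic above. The real workflow uses the Grasshopper components listed; the surface centroids here are just hypothetical (x, y, z) tuples:

```python
# Rough sketch of the sorting/naming logic: sort surfaces by centroid
# elevation (unit Z), then name a level for each in ascending order.

def levels_from_surfaces(centroids):
    """Sort surfaces by centroid elevation and name a level for each."""
    order = sorted(range(len(centroids)), key=lambda i: centroids[i][2])
    levels = []
    for n, i in enumerate(order):
        levels.append({
            "name": "Level {}".format(n + 1),  # level name
            "elevation": centroids[i][2],      # unit-Z of the centroid
            "surface_index": i,                # original surface index
        })
    return levels

levels = levels_from_surfaces([(0, 0, 7.0), (0, 0, 0.0), (0, 0, 3.5)])
```

Sorting first is what makes "Level 1" land on the lowest surface, mirroring the levels-first, floors-second order of operations.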
Using the handy Revit plugin, merge the level, and then, you guessed it, the floor.
N.B. If you’re feeling lazy, simply connect the data parameters to their respective To Flux components and send all data at once 🙂
I wanted to see how easy it would be to convert the Grasshopper file used to create points from a text file, created with Processing, into Flux’s flow tool.
The main difference between the Flow Tool and the Grasshopper file is that the Polyline component creates the five distinct lines from the text file; that block is not available in the Flow Tool, so the paths are individual lines.
One advantage is that you can easily share the result with anyone using a link to the data key.
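A quick sketch of what the Polyline step is doing: group the imported points by which path they belong to, and each group becomes one polyline. The row format ("path_id,x,y,z") is a hypothetical stand-in for the actual text file:

```python
# Hedged sketch of the Polyline step: group point rows by a path id so
# each group of points becomes one polyline. The row format is made up.

from collections import OrderedDict

def polylines_from_rows(rows):
    """Group point rows by path id, preserving order within each path."""
    paths = OrderedDict()
    for row in rows:
        pid, x, y, z = row.split(",")
        paths.setdefault(pid, []).append((float(x), float(y), float(z)))
    return list(paths.values())

rows = [
    "0,0,0,0", "0,1,0,0", "0,2,1,0",
    "1,0,0,0", "1,0,1,0",
]
polylines = polylines_from_rows(rows)
```

Without this grouping step, every point pair stays an individual line, which is exactly the behavior seen in the Flow Tool version.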
This is an alternative to voxelizing according to bending moment in the Surface Voxelization example and serves more as a representation of what Karamba does. This example takes a text file, created in Processing, which contains point information to extract a set of curves. The curves are shortest paths from a base point to the outer boundary of the 3D array input. These path curves are then used as the guides to determine where the subdivided voxels populate.
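The shortest-path idea itself can be sketched as a breadth-first search from a base cell out to the nearest boundary cell. The real paths come from Processing; the grid below is a made-up 2D stand-in for the 3D array, just to show the technique:

```python
# Conceptual sketch of "shortest path from a base point to the outer
# boundary": BFS over free cells (0 = free, 1 = blocked), 2D for brevity.

from collections import deque

def shortest_path_to_boundary(grid, start):
    """BFS from start; returns the cell path to the nearest boundary cell."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if r in (0, rows - 1) or c in (0, cols - 1):
            # Reached the outer boundary: walk parents back to the start.
            path, cell = [], (r, c)
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # base point is sealed off from the boundary

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
path = shortest_path_to_boundary(grid, (2, 2))
```

In the example files, the resulting path cells play the role of the guide curves that decide where the subdivided voxels go.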
Grasshopper plugins required:
If you’d like the files, send me an email!
Step 1. For this example, we have a 3D array ready in the Surface_ShortestPath_Voxelization_01.3dm file, so first off we need to extract the paths from the text file. Open ImportExport-CK05.gh and find the path parameter.
Step 2. Right-click the path parameter and select ‘set one file path’. A new window will open. Navigate to the location of the AgentTrail.txt and open the file. This sets the file path to that parameter so Grasshopper can find it on your computer.
Step 3. You can now see the generated curves if you preview the curve parameter. However, as you can see, it’s a bit shifted from the 3D array. But no worries, there’s a quick fix for this. Bake the curves as a group so we can position them correctly.
Steps 4-7. Go to top view in Rhino and draw a horizontal line along the bottom edge of the 3D array (Step 5). Take the bottommost point of the group of curves and move it perpendicular to the horizontal line until they intersect (Step 6). So, we just shifted the whole group of curves up so they fall within the 3D array. But to make it perfect, the final step is to mirror the whole thing: take any point on the curves and mirror the group along the horizontal line (Step 7).
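The manual alignment in Steps 4-7 boils down to a translate followed by a mirror. A tiny sketch, with made-up 2D points standing in for the curve group and the horizontal line at a chosen elevation:

```python
# Small sketch of the Steps 4-7 alignment: shift a point group up so its
# lowest point sits on a horizontal line, then mirror the group across
# that line. All coordinates here are hypothetical.

def align_and_mirror(points, line_y):
    """Shift points up to line_y, then reflect them across y = line_y."""
    shift = line_y - min(y for _, y in points)
    shifted = [(x, y + shift) for x, y in points]
    return [(x, 2 * line_y - y) for x, y in shifted]

curves = [(0.0, -3.0), (1.0, -1.0), (2.0, 0.0)]
aligned = align_and_mirror(curves, 0.0)
```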
Step 8. Take a look at your beautiful work and then go take a break 🙂
Step 9. Welcome back! Open PolySubdivision_02.gh in the same Rhino file. You will see all the needed inputs on the left, as well as the output parameters near the bottom left.
Step 10. First, input the 3D array by right-clicking the parameter labeled 3D array.
Step 11. Input all the necessary parameters accordingly. The inputs are basically the same as in the Voxel_Tests_R05.gh file, with the exception of the path curves created in the previous part of the example and an additional large mesh layer with the base cubes.
Step 12. Run the entire script by clicking on the data dam, or the play button.
Steps 13-15. Preview the results by turning on the geometry previews. Bake out the meshes through the mesh parameter labeled ‘final mesh’. And there it is in Rhino, ready to be put back into Processing.
This example takes a surface and voxelizes it according to the bending moment analysis. The resulting meshes can then be used in Processing.
Grasshopper plugins required:
If you would like access to the files, send me an email and I’ll be more than happy to share them with you 🙂
Step 1. Input the reference surface, within the 03_GH_INPUT_MESH layer, in Rhino by right-clicking on the mesh parameter in Grasshopper and selecting ‘set one mesh’. Do the same for the brep parameter for the support boundary, under 03_SUPPORT_BOUNDARY, which is a closed polysurface near the bottom of the surface.
Step 2. Once input, the Karamba bending moment analysis will run. You can preview the results or bake the colored curves from the analysis by right-clicking on the bake component (this step requires the Human Grasshopper plugin to be installed on your computer). Points are generated from the resulting curves; these are the points used to populate the voxels in the next step.
Step 3. Turn on the 00_CUBE layer to reveal the next inputs for voxelization. We will be voxelizing with cubes in this example, but the file provided also contains other voxel types to try out.
Step 4. First, input the 3D array of cubes under the 00_CUBE_3D_ARRAY layer. Notice that there is a data dam here, which acts like a play button; we’ll come back to this at the end when we want to run the whole script.
Step 5. The main principle of this voxelization is that the higher the bending moment, the more subdivided the voxel becomes. So we input two different voxel types that are populated on the points from the bending moment analysis. They are named accordingly: 00_CUBE_MESH_M goes inside the geometry parameter labeled Med Sub (subdivision), and so on.
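As a sketch, the selection rule amounts to thresholding the moment magnitude at each analysis point; the threshold values and voxel names below are invented for illustration:

```python
# Hedged sketch of the voxel-selection rule from Step 5: a higher bending
# moment at a point means a more subdivided voxel type gets placed there.
# Thresholds and names are made up.

def pick_voxel(moment, med_threshold=50.0, high_threshold=150.0):
    """Map a bending-moment magnitude to a voxel subdivision level."""
    m = abs(moment)
    if m >= high_threshold:
        return "high_sub"
    if m >= med_threshold:
        return "med_sub"
    return "base_cube"

voxels = [pick_voxel(m) for m in (10.0, -80.0, 200.0)]
```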
Step 6. Now, we go back to the play button from Step 4 and click on it to run the script. Voila! You have voxelized the surface!!
Steps 7-8. Bake the resulting meshes inside the mesh parameter labeled ‘final meshes to bake’. You can also preview your work by turning on the custom preview to the right of this parameter.