
How to create a 3D world and simulate a virtual camera in MATLAB on MacOS?

Hello,
I need to create a 3D world given a 3D model (STL, OBJ, etc.) and simulate a virtual camera in MATLAB.
The goal is to create a synthetic dataset containing image frames from the virtual camera inside the scene. The user would provide the intrinsics and the poses for the virtual camera.
I'm currently researching whether MATLAB has this capability and I have found the following:
I think this would work well except that Simulink 3D is not supported on Mac, so I don't have this option.
I have found the following link below which is very similar to what I need, but I'm having some trouble finding options to specify camera intrinsics for the view. Is it possible to provide intrinsics in a view?
Does anyone have any other ideas?
Thank you!

Accepted Answer

George Abrahams
George Abrahams on 6 Jul 2024 at 9:24
To extend @Umar's answers: to go from image points to an actual image made up of pixels, rather than something more like a scatter plot, you need to perform rasterization. For this, you can use the rasterize function of my 3D Rendering Toolbox on File Exchange (which I'm aware you already found; thank you for your comment!). Be aware that the function creates an RGB and a depth image, but doesn't currently handle lighting.
Just pay attention to the format of the vertices input. X and Y are in image space, i.e. measured in pixels. Z is in world space, i.e., the distance from the camera along the Z-dimension, measured in world units. This is how we know which face is visible (closest to the camera) at a given pixel when multiple faces are overlapping.
To get the X and Y coordinates of the image points, as @Umar stated, you can use either MATLAB's built-in Camera functions or the functions I provide in my toolbox - whatever you prefer. For the Z coordinates, if you don't use my functions, you'll have to do the coordinate transformations and calculate it for yourself.
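For intuition, the depth test at the heart of rasterization can be sketched in plain MATLAB. This is a simplified illustration of the idea described above, not the toolbox's actual implementation: each vertex row carries X/Y in pixels and Z as world-space depth, and the face closest to the camera wins at each pixel.

```matlab
% Simplified z-buffer rasterization of one triangle (illustration only).
% verts: 3-by-3 matrix, one row per vertex: [x_pixel, y_pixel, z_world].
% imgSize: [rows, cols] of the output image.
function [depth, mask] = rasterizeTriangle(verts, imgSize)
    depth = inf(imgSize);    % nearest depth seen so far at each pixel
    mask  = false(imgSize);  % pixels covered by the triangle
    xmin = max(1, floor(min(verts(:,1)))); xmax = min(imgSize(2), ceil(max(verts(:,1))));
    ymin = max(1, floor(min(verts(:,2)))); ymax = min(imgSize(1), ceil(max(verts(:,2))));
    for y = ymin:ymax
        for x = xmin:xmax
            b = barycentric(verts(:,1:2), [x y]);  % pixel's weights w.r.t. the triangle
            if all(b >= 0)                         % pixel lies inside the triangle
                z = b * verts(:,3);                % interpolate world-space depth
                if z < depth(y, x)                 % keep the face closest to the camera
                    depth(y, x) = z;
                    mask(y, x)  = true;
                end
            end
        end
    end
end

function b = barycentric(tri, p)
    % Solve p = b(1)*v1 + b(2)*v2 + b(3)*v3 with sum(b) == 1.
    A = [tri'; 1 1 1];
    b = (A \ [p'; 1])';
end
```

With multiple triangles, the same depth buffer is shared across all of them; whichever face writes the smallest Z at a pixel is the one that remains visible, which is exactly how overlapping faces are resolved.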

More Answers (3)

Umar
Umar on 4 Jul 2024 at 23:58
Hi Yeray,
Creating a 3D world and simulating a virtual camera in MATLAB can be achieved using various techniques. While Simulink 3D is not supported on Mac, there are alternative methods to accomplish this task. One approach is to use MATLAB's Computer Vision Toolbox, which provides functions for camera calibration, 3D reconstruction, and camera pose estimation. You can create a 3D scene from a 3D model and then simulate a virtual camera within this scene. Here's a step-by-step guide:
1. Import your 3D model (e.g., STL, OBJ) into MATLAB using appropriate functions like stlread or readObj.
2. Define the camera intrinsics (e.g., focal length, principal point, distortion coefficients) and extrinsics (camera pose) for the virtual camera. You can use the cameraParameters object to store camera intrinsics.
3. Use Computer Vision Toolbox functions like projectPoints to project 3D points onto the image plane of the virtual camera. This step generates the synthetic image frames.
4. Display the rendered images to visualize the output. You can use MATLAB's plotting functions or imshow to view the synthetic dataset.
By following these steps, you can create a 3D world and simulate a virtual camera in MATLAB without relying on Simulink 3D.
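The workflow above might be sketched roughly as follows. Note the assumptions: world2img and rigidtform3d require recent releases of the Computer Vision Toolbox, and the file name, intrinsics, and pose values are placeholders to be replaced with your own.

```matlab
% Sketch: load a mesh, define a virtual camera, project its vertices.
model = stlread("model.stl");            % triangulation object (Points, ConnectivityList)

focalLength    = [1000, 1000];           % [fx, fy] in pixels (placeholder)
principalPoint = [320, 240];             % [cx, cy] in pixels (placeholder)
imageSize      = [480, 640];             % [rows, cols]
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);

% Camera pose: world-to-camera rigid transform (placeholder values).
tform = rigidtform3d(eye(3), [0 0 500]);

% Project the mesh vertices into the image.
imagePoints = world2img(model.Points, tform, intrinsics);

% Visualize: a scatter of projected vertices, not yet a rendered frame.
figure; scatter(imagePoints(:,1), imagePoints(:,2), 4, "filled");
axis([1 imageSize(2) 1 imageSize(1)]); axis ij equal;
```

As discussed in the comments below, this produces projected vertex locations only; turning them into full image frames additionally requires rasterization.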
  1 Comment
Yeray
Yeray on 5 Jul 2024 at 16:48
Thank you for your answer! While I understand the workflow that you're proposing, I'm not able to find the function projectPoints in the Computer Vision Toolbox. Would you be able to provide a link to it?
In addition, I'm assuming that "projectPoints" acts very similarly, if not identically, to world2img. This would provide the image coordinates of the world points given the camera pose and intrinsics. These world points would obviously be the nodes of the 3D model, but how would you go about generating a complete synthetic frame? I imagine the output would look more like a 2D scatter plot of the image coordinates.



Umar
Umar on 5 Jul 2024 at 20:12
Hi Yeray,
What I meant to say is that you can create a generic projectPoints function yourself using the Computer Vision Toolbox; "projectPoints" is not a standard function in the Computer Vision Toolbox provided by MathWorks. However, the functionality of projecting 3D points onto a 2D image plane is commonly achieved using functions like "projtform2d" and "transformPointsForward" in MATLAB.
For more information regarding functions projtform2d and transformPointsForward, please refer to
https://www.mathworks.com/help/images/ref/projtform2d.html
https://www.mathworks.com/help/lidar/ref/loampoints.transformpointsforward.html
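For reference, these two functions operate on 2-D points: projtform2d stores a 3-by-3 projective transformation (homography) matrix, and transformPointsForward applies it to point coordinates. A minimal usage sketch, with an arbitrary example matrix:

```matlab
% Apply a 2-D projective transform (homography) to a set of points.
A = [1 0 0; 0 1 0; 0.001 0.002 1];       % arbitrary example homography
tform = projtform2d(A);
[u, v] = transformPointsForward(tform, [100 200], [50 80]);
```

Going from 3-D world points to 2-D image points additionally involves the camera intrinsics and extrinsics, as in the snippet below.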
To perform point projection in computer vision applications, I will provide an example code snippet that demonstrates how to achieve this:
% Define camera calibration parameters
focalLength    = [1000, 1000];    % [fx, fy] in pixels
principalPoint = [320, 240];      % principal point [cx, cy] in pixels
imageSize      = [480, 640];      % image size as [rows, cols]
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);

% Define camera extrinsics (replace with your camera's pose)
rotationMatrix    = eye(3);
translationVector = [0 0 500];

% Define 3D points
worldPoints = [x1 y1 z1; x2 y2 z2];

% Project 3D points onto the image plane
imagePoints = worldToImage(intrinsics, rotationMatrix, translationVector, worldPoints);

% Display projected points on the image
imshow(image); hold on;
plot(imagePoints(:,1), imagePoints(:,2), 'ro');
Please bear in mind that cameraIntrinsics and worldToImage require the Computer Vision Toolbox. Let's break down the code step by step:
Camera calibration parameters:
focalLength = [1000, 1000]; sets the focal length of the camera in pixels. principalPoint = [320, 240]; defines the principal point (the point where the optical axis intersects the image plane). imageSize = [480, 640]; specifies the size of the image in rows and columns. intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize); creates an intrinsic camera parameters object from these values.
Extrinsics:
rotationMatrix and translationVector represent the extrinsic parameters of the camera, i.e., its pose relative to the world coordinate system.
Defining 3D points:
worldPoints = [x1 y1 z1; x2 y2 z2]; defines the 3D points in the world coordinate system, one point per row.
Projecting 3D points onto the image plane:
imagePoints = worldToImage(intrinsics, rotationMatrix, translationVector, worldPoints); projects the 3D world points onto the 2D image plane using the camera parameters.
Displaying projected points on the image:
imshow(image); displays the image. hold on; ensures that subsequent plot commands do not overwrite it. plot(imagePoints(:,1), imagePoints(:,2), 'ro'); plots the projected image points as red circles on the displayed image.
In summary, this code snippet simulates the process of calibrating a camera, defining 3D points, projecting these points onto the image plane using camera parameters, and visualizing the projected points on an image, which is a fundamental step in computer vision applications like object tracking, augmented reality, and 3D reconstruction.
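For completeness, the projection performed here can also be written out by hand with the standard pinhole camera model (a sketch assuming no lens distortion and the premultiply convention, with example numbers):

```matlab
% Manual pinhole projection: x = K * (R*X + t), then divide by depth.
K = [1000 0 320; 0 1000 240; 0 0 1];   % intrinsic matrix [fx 0 cx; 0 fy cy; 0 0 1]
R = eye(3);                             % world-to-camera rotation
t = [0; 0; 500];                        % world-to-camera translation
X = [10; -20; 0];                       % a 3-D world point

Xc = R * X + t;                         % point in camera coordinates
x  = K * Xc;                            % homogeneous image coordinates
pixel = x(1:2) / x(3);                  % perspective divide -> [340; 200] in pixels
```

Working through the same math yourself is useful when you also need the camera-space depth Xc(3), for example as the Z input to a rasterizer.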
While there isn't a direct "projectPoints" function in the Computer Vision Toolbox (sorry for the earlier misunderstanding), you can achieve similar functionality using the aforementioned methods.

Yeray
Yeray on 7 Jul 2024 at 6:04
Thank you all for the answers!
