Create your first .NET 5.0 Web API
In today’s blog post, we will create a .NET 5-based application from scratch, targeting a cross-platform environment. The steps described below can easily be reproduced in any Linux or Windows environment.
.NET 5.0
Microsoft finally released .NET 5 at .NET Conf in November 2020.
.NET 5 is the unified platform for building cross-platform applications (Windows, Linux, macOS) that also run on edge and IoT devices. This unification brings increased performance, cost savings, and the option of running .NET apps on Kubernetes, which was not practical for legacy applications built on the .NET Framework.
Follow along with me!
In today’s blog post, we’ll use the .NET CLI to create a project from scratch. The prerequisites are as follows:
- SDK: .NET 5.0 Software Development Kit,
- Editor: Visual Studio Code, VIM,
- Container Builder: Podman (low footprint, no daemon needed), or, in the worst case, Docker ;)
Create Solution File
Let’s start by creating a new solution file to house our “green-wave” project. Open a command terminal in an empty directory and execute the following command:
dotnet new sln -n green-wave
Now let’s create our Web API application based on a DotNet Core template.
dotnet new webapi -n green-wave
The command generates a new Web API project named “green-wave”, along with a project skeleton including all the needed files. The generated file structure includes the following:
- “Startup.cs”: Contains the application’s settings and configuration (services and the middleware pipeline),
- “Program.cs”: Contains the “Main” method, which is the entry point of the ASP.NET Core application,
- “green-wave.csproj”: Defines which libraries are referenced,
- “green-wave/Controllers” directory: Holds the example “WeatherForecastController” for the application.
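For reference, the generated “Program.cs” looks roughly like the following. This is paraphrased from the .NET 5 “webapi” template (note that the hyphen in “green-wave” becomes an underscore in the namespace); minor details may differ between SDK versions:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

namespace green_wave
{
    public class Program
    {
        // Entry point: builds and runs the generic host that serves the Web API.
        public static void Main(string[] args)
        {
            CreateHostBuilder(args).Build().Run();
        }

        // Configures the host and wires in the Startup class.
        public static IHostBuilder CreateHostBuilder(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .ConfigureWebHostDefaults(webBuilder =>
                {
                    webBuilder.UseStartup<Startup>();
                });
    }
}
```

Everything the application does at runtime (routing, services, middleware) hangs off the Startup class referenced here.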
The project is not yet known to our solution. An easy way to link the project to the solution is with the “add” command, as follows:
dotnet sln green-wave.sln add ./green-wave/green-wave.csproj
Following software craftsmanship best practices, we create a test project to exercise our code. Once done, we add the test project to the solution file “green-wave.sln”:
dotnet new xunit -n green-wave.Test
dotnet sln green-wave.sln add ./green-wave.Test/green-wave.Test.csproj
The last thing we want to do is reference the green-wave API project from the test project, so that we can run unit tests against our APIs. To do so, we just need to execute the following:
dotnet add ./green-wave.Test/green-wave.Test.csproj reference ./green-wave/green-wave.csproj
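As a quick sanity check, a minimal xUnit test might look like this. The “WeatherForecast” type comes from the generated green-wave project (which computes Fahrenheit from Celsius); the test class and method names are our own:

```csharp
using Xunit;
using green_wave; // available via the project reference added above

namespace green_wave.Test
{
    public class WeatherForecastTests
    {
        [Fact]
        public void TemperatureF_IsDerivedFromTemperatureC()
        {
            // The template derives TemperatureF from TemperatureC.
            var forecast = new WeatherForecast { TemperatureC = 0 };

            // 0 °C should convert to 32 °F.
            Assert.Equal(32, forecast.TemperatureF);
        }
    }
}
```

Dropping this file into the green-wave.Test project gives `dotnet test` something real to execute.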
Build/Test the Solution
We’re done creating the projects. Let’s make sure everything builds and the tests pass:
dotnet build green-wave.sln
dotnet test green-wave.sln
Great! Now let’s run our application locally and expose it on TCP port 8080.
dotnet run -p ./green-wave/green-wave.csproj --urls http://localhost:8080
NB. By default, the application is exposed on localhost over HTTP on TCP port 5000 and over HTTPS on TCP port 5001.
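If you prefer to pin the listening address in code rather than on the command line, Kestrel’s URLs can also be set when the host is built. A sketch against the generated “Program.cs” (the hard-coded URL here is for illustration only):

```csharp
// In Program.cs: override the default http://localhost:5000 binding.
public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseUrls("http://localhost:8080"); // explicit listen address
            webBuilder.UseStartup<Startup>();
        });
```

The same effect can be achieved without recompiling via the ASPNETCORE_URLS environment variable.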
Now, let’s check whether the Web API behaves as expected by launching a simple curl command.
curl "http://localhost:8080/weatherforecast?location=detroit"
You can use Postman to perform the same or advanced API calls:
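The endpoint we’ve been calling is served by the template’s “WeatherForecastController”. A trimmed-down sketch of what the generated file contains (the summary list is shortened here, and the stock template simply ignores extra query parameters such as ?location=):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.AspNetCore.Mvc;

namespace green_wave.Controllers
{
    [ApiController]
    [Route("[controller]")] // resolves to /weatherforecast
    public class WeatherForecastController : ControllerBase
    {
        private static readonly string[] Summaries = new[]
        {
            "Freezing", "Bracing", "Chilly", "Cool", "Mild"
        };

        // GET /weatherforecast — returns five random forecasts.
        [HttpGet]
        public IEnumerable<WeatherForecast> Get()
        {
            var rng = new Random();
            return Enumerable.Range(1, 5).Select(index => new WeatherForecast
            {
                Date = DateTime.Now.AddDays(index),
                TemperatureC = rng.Next(-20, 55),
                Summary = Summaries[rng.Next(Summaries.Length)]
            });
        }
    }
}
```

Replacing this controller with your own endpoints is the natural next step once the skeleton works.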
As a final step, let’s see how to get our artifacts out for the next steps. In the same terminal window, launch the publish command:
dotnet publish -c Release -o publish
Several files are created automatically under the “publish” directory, as shown below:
- "green-wave.deps.json": The application’s runtime dependencies file,
- "green-wave.dll": The framework-dependent deployment version of the application, which we’re going to use as the entry point to our application,
- "green-wave.exe": A platform-specific executable version (produced when publishing for Windows),
- "green-wave.runtimeconfig.json": The application’s runtime configuration file (e.g., which .NET version to use).
The C# compiler (named Roslyn), used by the dotnet CLI tool, converts the C# source code into intermediate language (IL) code and stores the IL in an assembly (a DLL or EXE file).
That’s all for today. In a future post, we will discuss automating the build process using a multi-stage Dockerfile and running the resulting container from a pipeline. Stay tuned!
Thank you for reading!