Come and get the benefits, Tencent Cloud Cloud Studio big benefits!!!

Correction: After testing, the author found that the Cloud Studio high-performance basic edition does not support exposing services externally; by default, it can only be operated through its command line.
As shown in the figure:

It cannot expose API services externally, but the author has tested the other machine types, including the HAI machines, and they are supported.
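On machine types that do support serving, a locally running Ollama instance listens on port 11434 by default. Below is a minimal sketch of how a client might call Ollama's `/api/generate` endpoint; the model name `deepseek-r1:1.5b` and the localhost URL are illustrative assumptions, and the request only succeeds where the Ollama service is actually running:

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption: service running on this machine).
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"


def build_generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}


def generate(model: str, prompt: str) -> str:
    """Send a generation request; requires a reachable Ollama server."""
    body = json.dumps(build_generate_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns a single JSON object whose
        # "response" field holds the generated text.
        return json.loads(resp.read())["response"]

# Example (only works where Ollama is running and the model is pulled):
#   generate("deepseek-r1:1.5b", "Hello!")
```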

Tencent Cloud is indeed a conscientious cloud service, and this wave of benefits is definitely worth taking advantage of.
Today, let's introduce our main character:
Tencent Cloud Cloud Studio

Introduction:

Cloud Studio is a browser-based integrated development environment (IDE) that provides developers with a continuous cloud workstation. Users can program online anytime and anywhere without installation when using Cloud Studio.


As an online IDE, Cloud Studio includes basic IDE features such as code highlighting, auto-completion, Git integration, and terminal, while also supporting real-time debugging and plugin extensions, helping developers quickly complete the development, compilation, and deployment of various applications.
Online IDE interface

For a more detailed introduction, see the official documentation.

Why mention this? Because Tencent Cloud gives developers a lot of free usage time.
Initially, it offered 3,000 minutes (standard edition) and 10,000 minutes for the high-performance edition; now the standard allowance has jumped to 50,000 minutes. Alright, without further ado, let's get started.
Today's tutorial will teach you how to deploy large models using Cloud Studio through Ollama.

Step 1: Register for a Tencent Cloud account. There's not much to say here; just follow the prompts to register and verify your identity. It’s very quick.
Step 2: Open our main character, Tencent Cloud Cloud Studio
The interface is as follows:


Step 3: Create a new space.
As shown in the figure, first click "High-performance space" in the left menu, then click "Create New".


Some may wonder: the banner above already lets you create Ollama directly, preloaded with the hottest DeepSeek, so why bother explaining how to install Ollama ourselves? Because this teaches you to fish rather than just handing you a fish: once you know how to install Ollama, you will be able to install and run other large models later on. Alright, let's continue.


Choose the appropriate specifications and click "Next" to start creating the machine. It may take a few minutes, after which you are automatically redirected to the IDE page.
Since another wave of resources was released today, I selected the high-performance space in the lower-left corner; after all, it is a GPU server, yes, a GPU server you can try for free.

Step 4: Install Ollama.
Then, in the editor's menu bar, select Terminal ==> New Terminal.


The opened terminal appears in the lower-right area of the editor, where we can enter ordinary Linux commands. We will install Ollama, a deployment framework for large models; a single command does it.


curl -fsSL https://ollama.com/install.sh | sh

Copy the command above into the terminal, press Enter, and the automatic installation begins.

After a moment, the installation should complete successfully.
The interface may vary, but the commands are the same.
Once installed, enter ollama --help in the terminal to verify.
(Due to network issues, I am using an instance where Ollama has already been downloaded.)


The commonly used commands are:
ollama run <model> — download (if needed) and run a model.
ollama stop <model> — stop a running model.
ollama list — list all downloaded models.
ollama ps — list currently running models.
ollama rm <model> — delete a model (stop it first, then delete).
For other commands, refer to the documentation.
Ollama supports mainstream open-source large models; you can see the official website for specifics.
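The commands above can also be scripted. Here is a minimal sketch that builds the corresponding `ollama` command lines and runs them via `subprocess`; the model name `llama3` is only an example, and the execution path naturally requires Ollama to be installed on the machine:

```python
import subprocess
from typing import Optional


def ollama_cmd(action: str, model: Optional[str] = None) -> list:
    """Build an ollama command line, e.g. ['ollama', 'run', 'llama3'].

    Actions like 'list' and 'ps' take no model argument; 'run', 'stop',
    and 'rm' take the model name.
    """
    cmd = ["ollama", action]
    if model is not None:
        cmd.append(model)
    return cmd


def run_ollama(action: str, model: Optional[str] = None) -> str:
    """Execute the command and return its stdout.

    Requires Ollama to be installed and on PATH.
    """
    result = subprocess.run(
        ollama_cmd(action, model), capture_output=True, text=True, check=True
    )
    return result.stdout

# Example usage on a machine with Ollama installed:
#   run_ollama("list")
#   run_ollama("run", "llama3")
```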


