You open a link to a private Jupyter Lab in the cloud, and it authenticates you with your GCP account. From Jupyter Lab, you upload data from a private file on Google Cloud Storage to BigQuery, right from the Lab. Later, you execute several queries against BigQuery and visualize the data right in Jupyter Lab. Finally, you train a model with TensorFlow on a GPU attached to the Jupyter Lab, using the data from BigQuery. Then you upload the trained model back to GCS. And all of this without leaving the managed Jupyter Lab. Sounds like magic, doesn't it? Now let me show you a screenshot of what you will have by the time you finish reading this article:

The text above was approved for publishing by the original author.
