sparklyr.sedona: A sparklyr extension for analyzing geo...

We are excited to announce the availability of sparklyr.sedona, a sparklyr exten...

sparklyr 1.7: New data sources and spark_apply() capabi...

Sparklyr 1.7 delivers much-anticipated improvements, including R interfaces for ...

torch: Just-in-time compilation (JIT) for R-less model ...

Using the torch just-in-time (JIT) compiler, it is possible to query a model tra...

Train in R, run on Android: Image segmentation with torch

We train a model for image segmentation in R, using torch together with luz, its...

Beyond alchemy: A first look at geometric deep learning

Geometric deep learning is a "program" that aspires to situate deep learning arc...

Pre-processing layers in keras: What they are and how t...

For keras, the last two releases have brought important new functionality, in te...

Revisiting Keras for R

It's been a while since this blog featured content about Keras for R, so you mig...

Deep Learning with R, 2nd Edition

Announcing the release of "Deep Learning with R, 2nd Edition," a book that shows...

Community spotlight: Fun with torchopt

Today, we want to call attention to a highly useful package in the torch ecosyst...

torch outside the box

Sometimes, a software's best feature is the one you've added yourself. This post...

TensorFlow and Keras 2.9

New TensorFlow and Keras releases bring improvements big and small.

Introducing the text package

The text package attempts to provide user-friendly access and pipelines to Huggi...

luz 0.3.0

luz version 0.3.0 is now on CRAN. luz is a high-level interface for torch.

Audio classification with torch

Learn how to classify speech utterances with torch, making use of domain knowled...

Five ways to do least squares (with torch)

Get to know torch's linalg module, all while learning about different ways to do...

torch 0.9.0

torch v0.9.0 is now on CRAN. This version adds support for ARM systems running m...