Working everywhere
Machine Learning Engineer @huggingface
-
Hugging Face
-
(UTC -05:00) - https://huggingface.co/
- in/marc-sun
- @_marcsun
Pinned
-
huggingface/transformers Public
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
-
huggingface/accelerate Public
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
-
Finetune GPTQ model with peft and trl
# coding=utf-8
# Copyright 2023 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
-
optimum Public
Forked from huggingface/optimum
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
Python
-
huggingface/optimum-quanto Public
A PyTorch quantization backend for optimum