chore: simplify readme
Mah Neh committed Sep 20, 2024
1 parent c2dcba8 commit 11726e6
Showing 3 changed files with 1,676 additions and 177 deletions.
104 changes: 36 additions & 68 deletions README.md
@@ -1,111 +1,79 @@
# KerasTuner

[![license](https://img.shields.io/badge/License-Apache_2.0-green)](https://github.com/ghsanti/keras-tuner/blob/master/LICENSE)
![py-version](https://img.shields.io/badge/Python-3.10+-blue)
[![tests](https://github.com/keras-team/keras-tuner/workflows/Tests/badge.svg?branch=master)](https://github.com/ghsanti/keras-tuner/actions?query=workflow%3ATests+branch%3Amaster)
[![codecov](https://codecov.io/gh/ghsanti/keras-tuner/branch/master/graph/badge.svg)](https://codecov.io/gh/ghsanti/keras-tuner)

[![jax](https://img.shields.io/badge/jax-blue)](https://github.com/jax-ml/jax)
[![tf](https://img.shields.io/badge/tensorflow-yellow)](https://github.com/tensorflow/tensorflow)
[![pytorch](https://img.shields.io/badge/pytorch-orange)](https://github.com/pytorch/pytorch)

Personal fork of the great [KerasTuner](https://github.com/keras-team/keras-tuner), modified to fit personal purposes while staying easy to fix and improve. Feel free to try it out.

## Install

You can open the repository in a Codespace (it installs everything automatically), or install locally:

- Make a test directory: `mkdir test_tuner && cd test_tuner`
- Create a venv with Python 3.11: `python3.11 -m venv .venv && source .venv/bin/activate`
- Install the packages:

```bash
pip install git+https://github.com/ghsanti/keras-tuner
pip install jax[cpu]  # or tensorflow-cpu, or torch; only tf and jax[cpu] have been tested
```
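
To select a backend, a minimal sketch (assuming Keras 3, where the `KERAS_BACKEND` environment variable picks the backend before `keras` is imported):

```python
import os

# Must be set before importing keras; "tensorflow" is the default backend.
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow" / "torch"

import keras
print(keras.backend.backend())  # confirms which backend is active
```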

Then try [example.py](https://github.com/ghsanti/keras-tuner/blob/master/example.py) and see the results.

<details>
<summary>Main Changes</summary>

- Detailed results: epochs and executions are logged by default, with no averaging.
- Type annotations (partial).
- Backwards incompatible: most legacy code is dropped, some functions are renamed, and metrics tracking is refactored.

Usage and output remain very similar to the original, and the guides and links below are still valid.
</details>

<details>
<summary>Built-in algorithms</summary>

KerasTuner lets you leverage one of the built-in search algorithms to find the best hyperparameter values for your models:

- Bayesian Optimization
- Hyperband
- Random Search

It is also designed to be easy for researchers to extend in order to experiment with new search algorithms. Each built-in algorithm is exposed as its own tuner class; see the sketch right after this block.
</details>
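
A minimal sketch of the alternative tuner classes, assuming the upstream class names and constructor arguments carry over unchanged to this fork, and reusing the `build_model` function defined in the code example below:

```python
# Hypothetical usage; `build_model` is the model-building function from the
# code example below, and the constructors are the upstream KerasTuner ones.
bayes_tuner = keras_tuner.BayesianOptimization(
    build_model, objective='val_loss', max_trials=10)
hyperband_tuner = keras_tuner.Hyperband(
    build_model, objective='val_loss', max_epochs=20, factor=3)
```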

<details>
<summary>Code Example</summary>

Import KerasTuner and Keras:

```python
import keras_tuner
import keras
# No backend import is needed: TensorFlow is the default backend,
# but whichever backend you chose must be installed.
```

Write a function that creates and returns a Keras model.
Use the `hp` argument to define the hyperparameters during model creation.

```python
def build_model(hp):
    model = keras.Sequential()
    model.add(keras.layers.Dense(
        # Search over a fixed list of layer widths.
        hp.Choice('units', [8, 16, 32]),
        activation='relu'))
    model.add(keras.layers.Dense(1))
    model.compile(loss='mse')
    return model
```
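
Besides `hp.Choice`, the upstream `hp` object also offers ranged and boolean hyperparameters (`hp.Int`, `hp.Float`, `hp.Boolean`). A minimal sketch, assuming this fork keeps those methods and reusing the imports above; the names `units`, `use_dropout`, and `lr` are illustrative:

```python
def build_model(hp):
    model = keras.Sequential()
    model.add(keras.layers.Dense(
        # Integer range instead of a fixed list of choices.
        hp.Int('units', min_value=8, max_value=64, step=8),
        activation='relu'))
    if hp.Boolean('use_dropout'):  # optional layer gated by a boolean flag
        model.add(keras.layers.Dropout(0.25))
    model.add(keras.layers.Dense(1))
    model.compile(
        optimizer=keras.optimizers.Adam(
            # Log-scale sampling suits learning rates.
            hp.Float('lr', min_value=1e-4, max_value=1e-2, sampling='log')),
        loss='mse')
    return model
```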

Initialize a tuner (here, `RandomSearch`).
We use `objective` to specify the metric used to select the best models,
and `max_trials` to specify the number of different hyperparameter configurations to try.

```python
tuner = keras_tuner.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=5,  # number of different hyperparameter combinations to try
)
```
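
The Main Changes above mention per-epoch and per-execution logging; in upstream KerasTuner, repeating the same hyperparameter set is controlled by `executions_per_trial`, not `max_trials`. A sketch, assuming this fork keeps that argument:

```python
tuner = keras_tuner.RandomSearch(
    build_model,
    objective='val_loss',
    max_trials=5,            # distinct hyperparameter combinations to try
    executions_per_trial=3,  # repeat each combination; runs are logged individually
    overwrite=True,          # start from a clean results directory
)
```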

Start the search and get the best model:

```python
tuner.search(x_train, y_train, epochs=5, validation_data=(x_val, y_val))
best_model = tuner.get_best_models()[0]
```
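
For a fully self-contained run, hypothetical random data can stand in for a real dataset, and `get_best_hyperparameters` (standard upstream API, assumed unchanged here) exposes the winning configuration:

```python
import numpy as np

# Illustrative stand-in data: 20 features, scalar regression target.
x_train, y_train = np.random.rand(1000, 20), np.random.rand(1000, 1)
x_val, y_val = np.random.rand(200, 20), np.random.rand(200, 1)

tuner.search(x_train, y_train, epochs=5, validation_data=(x_val, y_val))
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hps.get('units'))  # the winning value of the 'units' hyperparameter
```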

To learn more about KerasTuner, check out [this starter guide](https://keras.io/guides/keras_tuner/getting_started/).
</details>

## Quick links

- [Getting started with KerasTuner](https://keras.io/guides/keras_tuner/getting_started/)
- [KerasTuner developer guides](https://keras.io/guides/keras_tuner/)
- [KerasTuner API reference](https://keras.io/api/keras_tuner/)

## Contributing Guide

Please refer to the [CONTRIBUTING.md](https://github.com/ghsanti/keras-tuner/blob/master/CONTRIBUTING.md) for the contributing guide.

## Citing KerasTuner

If KerasTuner helps your research, we appreciate your citations.
Here is the BibTeX entry:

```bibtex
@misc{omalley2019kerastuner,
  title = {KerasTuner},
  author = {O'Malley, Tom and Bursztein, Elie and Long, James and Chollet, Fran\c{c}ois and Jin, Haifeng and Invernizzi, Luca and others},
  year = 2019,
  howpublished = {\url{https://github.com/keras-team/keras-tuner}}
}
```