mariusjabami committed
Commit 16c7b3f · verified · 1 Parent(s): 2b7e589

Update README.md

Files changed (1):
  1. README.md +15 -9

README.md CHANGED
@@ -17,13 +17,15 @@ tags:
  - transformers
  - open-source
  - causal-lm
- - lambdaindie
+ - lxcorp
  ---
 
  # lambdAI — Lightweight Math & Logic Reasoning Model
 
  **lambdAI** is a compact, fine-tuned language model built on top of `TinyLlama-1.1B-Chat-v1.0`, designed for educational reasoning tasks in both Portuguese and English. It focuses on logic, number theory, and mathematics, delivering fast performance with minimal computational requirements.
 
+ ---
+
  ## Model Architecture
 
  - **Base Model**: TinyLlama-1.1B-Chat
@@ -34,14 +36,15 @@ tags:
  - **Batch Size**: 20 per device
  - **Epochs**: 3
 
+ ---
+
  ## Example Usage (Python)
 
  ```python
-
  from transformers import AutoTokenizer, AutoModelForCausalLM
 
- model = AutoModelForCausalLM.from_pretrained("lambdaindie/lambdai")
- tokenizer = AutoTokenizer.from_pretrained("lambdaindie/lambdai")
+ model = AutoModelForCausalLM.from_pretrained("lxcorp/lambdai")
+ tokenizer = AutoTokenizer.from_pretrained("lxcorp/lambdai")
 
  input_text = "Problema: Prove que 17 é um número primo."
  inputs = tokenizer(input_text, return_tensors="pt")
@@ -49,18 +52,21 @@ inputs = tokenizer(input_text, return_tensors="pt")
  output = model.generate(**inputs, max_new_tokens=100)
  print(tokenizer.decode(output[0], skip_special_tokens=True))
 
- ```
 
- About Lambda
+ ---
+
+ About λχ Corp.
 
- Lambda is an indie tech startup founded by Marius Jabami in Angola, focused on AI-driven educational tools, automation, and lightweight software solutions. The lambdAI model is the first release in a planned series of educational LLMs optimized for reasoning, logic, and low-resource deployment.
+ λχ Corp. is an indie tech corporation founded by Marius Jabami in Angola, focused on AI-driven educational tools, robotics, and lightweight software solutions. The lambdAI model is the first release in a planned series of educational LLMs optimized for reasoning, logic, and low-resource deployment.
 
- Stay updated on the project at lambdaindie.github.io and huggingface.co/lambdaindie.
+ Stay updated on the project at lxcorp.ai and huggingface.co/lxcorp.
 
 
  ---
 
- Developed with care by Marius Jabami — Powered by ambition and open source.
+ Developed with care by Marius Jabami — Powered by ambition, faith, and open source.
+
 
  ---
 
+ ---
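
The README touched by this commit describes the fine-tuning setup only as a short list (TinyLlama-1.1B-Chat base, batch size 20 per device, 3 epochs). As a rough, hypothetical sketch only, those three values would map onto the Hugging Face `Trainer` API roughly as follows; the output path, dataset, and every setting not listed in the README are placeholders, not values taken from this repository.

```python
# Illustrative sketch, not part of this commit: wires the README's stated
# hyperparameters (TinyLlama-1.1B-Chat base, batch size 20 per device,
# 3 epochs) into TrainingArguments. Paths and dataset are placeholders.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # base checkpoint named in the README
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

training_args = TrainingArguments(
    output_dir="./lambdai-finetune",   # placeholder output path
    per_device_train_batch_size=20,    # "Batch Size: 20 per device"
    num_train_epochs=3,                # "Epochs: 3"
)

# `train_dataset` would be a tokenized dataset of math/logic problems in
# Portuguese and English; no dataset is included in this commit, so the
# training call is left commented out.
# trainer = Trainer(model=model, args=training_args, train_dataset=train_dataset)
# trainer.train()
```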