Cohere announces $270-million USD Series C

Cohere announces $270-million USD Series C from Inovia, Nvidia, Oracle, Salesforce (Betakit)

The Globe and Mail reported earlier:

Cohere raising up to $250-million in Inovia-led deal valuing OpenAI rival at $2-billion

Artificial-intelligence company Cohere Inc. is in advanced talks to raise up to US$250-million from investors in a financing that could value the Toronto-based startup at just over US$2-billion.

Cohere, which develops language-processing technology, has been in discussions with chip maker Nvidia Corp. and investment firms about securing funds, according to two sources familiar with the matter. The round is being led by Inovia Capital, with partner Steven Woods, a former senior director of engineering with Google, steering the investment for the Montreal venture capital firm.

About Cohere and what they do and some background

The description from the Globe and Mail leaves me wondering what else Cohere includes in its NLP work. I have spoken to data scientists over the last 25 years, and it was, and is, a science. The G&M leave something out, describing what some would call fancy search.

Cohere is a natural language processing company, a branch of AI broadly devoted to improving the ability of computers to generate and interpret text. Cohere’s large language models (LLMs), the programs that do this work, have been trained to understand language by digesting essentially the entirety of the publicly available internet.

What I did appreciate is the description that "Cohere aims to be a platform powering countless products and services" built by "non-expert developers". This resonates with what I have heard Nvidia's Jensen Huang describe.

Origin of transformer model

I have listened and read enough to appreciate the importance of transformers in models like GPT-3. The architecture comes from a paper entitled "Attention Is All You Need", a copy of which is hosted locally at WordPress.

Gomez and his fellow researchers outlined a new method dubbed transformers. Rather than process words sequentially, transformers consider all previous words in a sentence when calculating the probability of the next one. Transformers deploy a mechanism called “attention” that essentially helps the model more accurately guess the meaning of a word based on those around it, parsing, for example, whether “bat” refers to the animal or the implement used to whack a ball.

In short, the transformer method is described in the Globe as word-based. ChatGPT's output does not reflect an overall understanding of meaning; rather, the model captures the statistical logic of the words that make up the output.
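The "attention" mechanism the Globe describes can be sketched in a few lines. What follows is a toy illustration only, not Cohere's or anyone's production code: the two-dimensional embeddings, the tiny vocabulary, and the "animal-ness"/"sports-ness" axes are all hypothetical, chosen simply to show how a query vector assigns weights to the words around it.

```python
import math

def softmax(scores):
    # Exponentiate and normalize so the weights sum to 1
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    The query scores every key by dot product, softmax turns the
    scores into weights, and the output is a weighted blend of the
    value vectors -- the "guess the meaning from the words around it"
    step described in the article.
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    output = [sum(w * v[i] for w, v in zip(weights, values))
              for i in range(len(values[0]))]
    return output, weights

# Hypothetical 2-D embeddings: axis 0 ~ "animal-ness", axis 1 ~ "sports-ness"
cave = [1.0, 0.0]   # animal-flavoured context word
ball = [0.0, 1.0]   # sports-flavoured context word
query = [0.1, 0.9]  # a sports-leaning query, e.g. from "swing the bat"

# The sports-leaning query attends more strongly to "ball" than to "cave"
output, weights = attention(query, [cave, ball], [cave, ball])
```

Run on these toy vectors, the second weight (for "ball") comes out larger than the first (for "cave"), which is the disambiguation effect the article describes for "bat".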

Conclusion

The core of the Cohere development framework is natural language processing enhancements: significant advances that use data sets far larger than previously imaginable to produce textual outputs of far greater quality.

In current models, this machine learning yields outputs that surpass any previous efforts.

Observation

The holy grail for me is still process enhancement: improvement and speed. Such improvement could then support the internal business processes of a bank. That combination would be the minimum needed to take over human interaction.

The distinction from what I see in the transformer method would be working beyond the "next word": decision logic based on chunks of data and words which together drive processes that are permissible within the guardrails of regulation and policy.

The root data sets would differ, built on customer data, attributes and behaviours assessed alongside the constraints and opportunities within regulatory regimes.

I want to understand the possibilities for the next phases, moving beyond chat, and just how far off they are.

Tags #AI #AI-series #Aidan-N-Gomez #ChatGPT #transformers #transformer-method
