Submitted by bilby_- t3_xvdlvp in MachineLearning

Low-code ML tools allow a bigger population within an organization to take part in ML development, while enabling flexible customization for the technically savvy DS.

Is this what the future looks like? Have you encountered a different way of enabling "non-technical" users, such as analysts or business users, to develop or maintain ML?

2

Comments


[deleted] t1_ir0kquz wrote

[deleted]

36

bilby_- OP t1_ir0ukpa wrote

When would one rather use WordPress, and when not?

−2

[deleted] t1_ir0yq19 wrote

[deleted]

7

crayphor t1_ir1h2df wrote

Exactly. Researchers often need to implement things from scratch, so we use more code-heavy solutions to have deep control. Someone putting these techniques into use may prefer just to load up a pre-trained model or run a training script on an existing architecture.
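To make the contrast concrete, here is a minimal sketch of the "practitioner" side of that split, assuming scikit-learn and a hypothetical `model.joblib` artifact: a model is trained and persisted once, and everyone downstream just loads and uses it with no model code of their own.

```python
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Train once (the "from scratch" side) and persist the fitted model.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
joblib.dump(model, "model.joblib")

# Later, a practitioner just loads the artifact and predicts --
# no knowledge of the architecture or training code required.
reloaded = joblib.load("model.joblib")
preds = reloaded.predict(X[:5])
```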

1

bilby_- OP t1_ir2dolh wrote

Yes. Fixed it, thanks

1

[deleted] t1_ir2e5ti wrote

[deleted]

2

bilby_- OP t1_ir2ey9h wrote

I work at a big mobile games company, and we have a scenario in which 3-4 models in production are providing measurable lift in KPIs, and now we want to scale the use of the technology across the entire org.

1

ok531441 t1_ir0ew1c wrote

> Low code ML tools offer a bigger population within an organization to take part in ML development

Nope.

> while allowing flexible customization for the technical user.

Not at all.

So overall I would say no.

Seriously, who are the people in your organisation who know enough to work in ML but not enough to write code? (that's who the first part is supposedly enabling). And what no code tools give you better customisation than a programming language? (that claim is basically marketing nonsense)

11

bilby_- OP t1_ir0fsgp wrote


Low-code tools can enable people who only know SQL (some kinds of analysts) to spin up and run an ML pipeline.

And if you look at tools such as Dataiku and DataRobot, they also allow the use of Python to build more customized pipelines.

For example, an insurance company can leverage a low-code tool plus an analyst to test models such as predicting the probability that each elderly customer will buy life insurance, or that each young customer will purchase car insurance. The two inputs to the tool would be a population SQL query and a target SQL query, reading from and writing to your DWH.
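A minimal sketch of that two-query workflow, using an in-memory SQLite table as a stand-in for the DWH (the table, column names, and data here are hypothetical; a real low-code tool would hide all of this behind a UI):

```python
import sqlite3

import pandas as pd
from sklearn.linear_model import LogisticRegression

# In-memory SQLite stands in for the data warehouse.
conn = sqlite3.connect(":memory:")
pd.DataFrame({
    "customer_id": range(8),
    "age": [72, 68, 75, 24, 22, 27, 70, 25],
    "bought_life_insurance": [1, 1, 0, 0, 0, 1, 1, 0],
}).to_sql("customers", conn, index=False)

# The analyst supplies only two SQL queries: population and target.
population_query = "SELECT customer_id, age FROM customers WHERE age >= 65"
target_query = "SELECT customer_id, bought_life_insurance FROM customers"

features = pd.read_sql(population_query, conn)
labels = pd.read_sql(target_query, conn)
data = features.merge(labels, on="customer_id")

# The tool trains a model and scores each customer in the population.
model = LogisticRegression().fit(data[["age"]], data["bought_life_insurance"])
scores = model.predict_proba(data[["age"]])[:, 1]  # purchase probabilities
```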


Generally, in the data science world you have personas who are more or less technical. The more technical can leverage Docker, Kubeflow, CI/CD, etc.; others work with Jupyter notebooks, and maybe in the future, only SQL.

2

dataslacker t1_ir0qfv8 wrote

In my experience sklearn/pandas is about the right level of "low code" in ML. Anything easier and you're sweeping too much under the rug. Plus it's free.
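For reference, the level of abstraction being described is roughly this (a generic sketch on a bundled sklearn dataset, not tied to any particular use case):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A full train/evaluate loop is only a handful of lines in sklearn/pandas.
data = load_breast_cancer(as_frame=True)
X, y = data.data, data.target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
```

Low enough that nothing important is hidden, but still a few explicit steps (split, fit, score) that a no-code tool would bury.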

8

nogear t1_ir139x3 wrote

If you are creating ML models you should know the science; low code will not save you from biases and other pitfalls.

Imagine letting a database guy build an ML model, then using its probabilities for business decisions, just to find out that your model is crap and highly biased.

And if you know the science, you should usually be able to code with Python / Pandas / sklearn...

I am not against good and powerful tools, but typing the code is the simplest part of ML...

3

Stats_Fast t1_ir29j0m wrote

>For example, an insurance company can leverage a low-code tool plus an analyst to test models such as predicting the probability that each elderly customer will buy life insurance, or that each young customer will purchase car insurance.

Insurance companies don't run their business like the marketing examples for low code environments.

They employ lots of highly qualified people with a range of skills across math, statistics, economics, legal, and programming. Writing basic Python isn't the constraint.

2

cantfindaname2take t1_ir2vwe7 wrote

IMHO low code is fine for tasks that aren't resource-intensive. Once an analyst (without much programming skill) starts running into performance or memory problems, that person will have a hard time adjusting their pipeline, because low-code tools usually offer little flexibility.

1

mystic12321 t1_ir0gi85 wrote

I don't think low-code tools are the future of ML development.

What's common to most low-code tools is that they deliver lots of abstractions and basically hide everything under a set of assumptions. Where do those assumptions come from? I would say usually from research papers, and research papers are very far from solving real-world problems.

Maybe in the past it was enough to just take a big model trained by a tech giant and apply it directly to a problem, but those times are gone, because we have already solved most of those "easy" problems. As ML/AI enters new domains and industries, the complexity of problems grows. These problems need custom and complex solutions, something that can only be delivered by a domain expert working together with an ML expert. And low-code tools just won't be able to deliver what they need most: flexibility.

Where might low-code ML tools be useful? Very early, in the POC/prototyping stage, which might be done by a person without strong coding/ML skills.

Have you encountered a different way of allowing "non-technical" users such as analysts or business users to develop ML? Not really, but it should be limited to the stages I mentioned above, for a very simple reason: ML development is very complex, and without proper education and training it will lead to low-quality solutions. AI is affecting many aspects of our lives, and I personally don't want to be surrounded by low-quality AI, e.g. an AI that has biases or just gives absurd results for some very rare cases.

10

dataslacker t1_ir0pgi5 wrote

To your last point, what people often don't seem to understand is that coding is the easy part of ML development. It's very rare to find a person who is capable of understanding the details of a model, its pitfalls, etc., but not able to code in Python.

9

Kroutoner t1_ir0vv2d wrote

Even in the rare cases where you do find someone like this, two weeks with a general Python tutorial plus a framework tutorial will probably be more than enough for them to start contributing.

7

nibbels t1_ir0rpxn wrote

You're posting in r/MachineLearning; why would you expect anyone to say yes? That said, low-code tools will probably gain traction when companies think they can use them instead of hiring engineers. But most serious projects won't be done with low-code tools; they will continue to be done by mega-companies and fancy research labs.

4

space-ish t1_ir0rdpm wrote

Was gonna agree, until I realized that the people in an org using low-code tools are the DS experts themselves. Not because they can't code, but because somewhere in the chain of command a decision was made to buy and implement said tool.

2

HateRedditCantQuitit t1_ir17eu2 wrote

It’s impossible to say what the future of such a fast moving field looks like without specifying a timescale. Will low-code ML replace coded ML stuff in the next year or two? Probably not. What will the main industrial applications of ML be in 5-6 years? Shit moves so fast it’s hard to say.

2

Charming-Fishing3155 t1_ir50t3m wrote

Yes. As machines get cheaper and data scientists get more expensive and harder to find, it is always better to use no-code tools for ML development.


The idea is not the low code itself, but how to find the best model, deploy it, and monitor it. I would argue that this is done via experimentation, and that this can be automated.

If we define the ML life cycle as:


  1. Understand the business problem and map it to ML
  2. Get the data
  3. Prepare the data
  4. Feature engineering
  5. Feature selection
  6. Model training (train, test, optimize)
  7. Model deployment
  8. Model monitoring

There are tools today that can do steps 3-8 automatically. In some cases (e.g. 1,000 models) a data scientist simply does not have time to code the models. Also, imagine you have only 10 models but get new training data every week; are you going to code new models every week?
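The weekly-retraining scenario is exactly the kind of thing that automates well; here is a minimal sketch, where `fetch_training_data` is a hypothetical stand-in for the weekly data pull and synthetic data replaces the warehouse:

```python
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression


def fetch_training_data(week):
    # Hypothetical stand-in for pulling that week's training data from the DWH.
    return make_classification(n_samples=100, n_features=4, random_state=week)


def retrain(template_model, weeks):
    """Refit a fresh copy of the model on each week's data -- no new code per week."""
    models = {}
    for week in weeks:
        X, y = fetch_training_data(week)
        models[week] = clone(template_model).fit(X, y)
    return models


# One template model, retrained on three weeks of data with zero per-week coding.
models = retrain(LogisticRegression(max_iter=1000), weeks=range(3))
```

In a real system the loop would be driven by a scheduler, and the fitted models would be versioned and deployed rather than kept in a dict.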


The reasons that these tools are not in wider use are:


  1. AI/ML is not widespread (outside of tech).
  2. The tools are (very) expensive.
  3. The tools mainly focus on training and lack deployment and monitoring (at least the open-source ones).


At the end of the day, businesses care about productivity, i.e. can you get as many models into production in the shortest amount of time possible?

2

Clicketrie t1_irakxge wrote

I think there are a lot of ways low code is going to augment ML too. Software like Roboflow and Comet ML, used in conjunction with whatever you're building in Python, is sorta hybrid, and I think there's a ton of opportunity there. In general the more cookie-cutter problems will be automated, and there will always be a use case for the more innovative stuff.

1

Smoothie17 t1_ir13lwf wrote

AI can already code better than a human; the future will be interesting.

−2

TrueEqualFalse t1_ir1h1gf wrote

Most SOTA LLMs that you see in fancy demos doing LeetCode problems have small context windows and are unable to write code for large codebases, so it is unlikely they can make any meaningful contributions.

2