House Price Crash Forum

A techie above 35? You are most likely to be fired


Fairyland


0
HOLA441

This 40+ age thing is becoming irrelevant in IT. It was the case 10 years ago, but the boomers have exited on fat pensions, Gen Xers with experience are in short supply, and everyone has failed to train up Gen Y.

My firm fired a load of under 30s last year (team of 20) in the UK as they were massively underperforming. Literally, they achieved less than a couple of experienced 30-40 year olds. So, yes they were perhaps 1/3 the cost individually, but they were probably 1/10th as productive on the same basis. This is also seen in offshoring.

We also have the same thing in sales. I can hire a good 36-55 year old and they can generate £4m of revenue at 50% margin. The team of inside sales reps (about 16) all in their early 20s generated less than £800k all year at about 15 points of margin.
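Taking those figures at face value, the gap is even wider in gross-profit terms than in revenue terms. A quick back-of-envelope check, using only the numbers quoted above:

```python
# Back-of-envelope gross profit comparison using the figures from the post.
experienced_revenue = 4_000_000   # £4m revenue from one good 36-55 year old
experienced_margin = 0.50         # at 50% margin
team_revenue = 800_000            # < £800k from the ~16 inside sales reps
team_margin = 0.15                # at about 15 points of margin

experienced_profit = experienced_revenue * experienced_margin   # £2,000,000
team_profit = team_revenue * team_margin                        # £120,000

# Per head: one experienced hire vs a team of 16 juniors.
per_head_team = team_profit / 16                                # £7,500

print(experienced_profit, team_profit, per_head_team)
```

On those numbers, one experienced hire generates roughly 17x the gross profit of the entire junior team, and over 250x per head.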

The causes of this are: a) Gen Y are basically Generation Snowflake, due to a lack of constructive criticism growing up; b) IT has generally failed on training since the late 1990s, as has the wider UK; c) degrees became less technical.

Going back perhaps a decade and a half after the Unix Wars, I also think that the downfall of Sun Microsystems was a big factor. They used to give away loads of stuff to universities; after that, everything became Windows with the odd outpost of Linux for the clever people. Whilst an easy-to-use GUI is great for the productivity of the masses, it completely dumbs down the minds of future technical staff if they aren't forced to script.

I have some hope that Cloud CLIs, APIs, Docker and IoT might fix things, but then in the UK we have 2 year olds who have been brought up to use tablets. I'm forcing my kids to use Linux and Android tools before they get anywhere near Windows, OSX and iOS.


1
HOLA442
6 hours ago, developer said:

Programming by itself is a very useful skill for making your own scripts and tools. But as soon as you make tools for other people as an employee you are selling yourself down the river. The only way to benefit from being a good programmer is to sell your own tools or build your own tools for some kind of business service type of purpose.

I don't regret being good at programming and having experience in it from programming a lot of personal projects. I think I will have a big advantage once I have enough capital to develop some interesting software or web applications. But I don't envy anyone going into it as a career. It's like designing a robot to replace you, great if you own the robot but terrible if your boss owns it.

There are two major forces putting programmers out of a job right now:

1) Monopolization of products and services ie modern capitalism.

2) Open source code.

(1) means a single web application/piece of software can serve the entire world and doesn't need to be rewritten - think Snapchat, Facebook, Android.

(2) The open source movement/ culture in the programming world cuts down development time significantly for the majority/ run of the mill programming tasks. There's a good chance someone has done what you need and open sourced it already.

The guys I know who are seriously into IT view programming skills as a soft skill to back up your core technical skills, whether that's sys admin, networking or IT security. But I think security is the most interesting of the three.

 

Becoming a programmer in IT is like deliberately choosing a job with less responsibility, and if you aren't looking after assets in a capitalist system then you are no better than a receptionist or supermarket assistant. That's life, unfortunately: the rewards are moving towards jobs which involve either managing or protecting capital, hence the importance of IT security now.

 

Great post.  Onto your points:-

Quote

Programming by itself is a very useful skill for making your own scripts and tools. But as soon as you make tools for other people as an employee you are selling yourself down the river. The only way to benefit from being a good programmer is to sell your own tools or build your own tools for some kind of business service type of purpose.

You hit upon the key point: building tools for an employer means you'd better hope he wants you to keep building NEW tools in the future. Each tool you build is essentially a one-off. It's not like a factory with constant demand. Software tools can run for decades (see how much COBOL still keeps the world functioning). You might get to maintain the tool, but that's rarely a full-time role. If maintenance is bug-fixing rather than improvements, then it's diminishing returns on your necessity within the company. That's the beauty (and danger) of code - it's so efficient. As you say, if you develop your OWN tools, you can resell them over and over to different clients (actually, this is essentially what I've been doing for the last 13 years).

Quote

 

There are two major forces putting programmers out of a job right now:

1) Monopolization of products and services ie modern capitalism.

2) Open source code.

(1) means a single web application/piece of software can serve the entire world and doesn't need to be rewritten - think Snapchat, Facebook, Android.

(2) The open source movement/ culture in the programming world cuts down development time significantly for the majority/ run of the mill programming tasks. There's a good chance someone has done what you need and open sourced it already.

The guys I know who are seriously into IT view programming skills as a soft skill to back up your core technical skills, whether that's sys admin, networking or IT security. But I think security is the most interesting of the three.

 

Absolutely.  Where's the market to develop a brand new operating system? A new search engine? New social platforms? New e-commerce / content management templates?   All of these markets have established solutions that have taken tens of thousands of developer years to develop. This is the evolution of code - we will ALWAYS arrive at this point.  It would be highly inefficient to have billions more lines of code written that do the same thing as existing software/apps.  Nobody's going to make the time/money investment to come up with a commercial competitor to Windows in 2017.  Why did Google use the Linux kernel to develop Android? Because even with the number of programmers at their disposal, they're not going to reinvent the wheel. That wheel would take many extra years of time to develop.  Any complex product is designed to be prototyped and templated. 

Sure, a new company could in theory reinvent the wheel in their niche, but as each year passes, and as each app/software codebase becomes more established, the barrier to entry gets higher.  There is no monopolisation like monopolisation in software/web apps. 

To sum up, being a programmer from here on out is not enough, even though it WAS enough even just 10 years ago.  You'll need to really specialise and dig deeper, or go it alone, have good all-round business and marketing skills, and sell (or use) your own niche toolsets to make a living.


2
HOLA443
6 minutes ago, canbuywontbuy said:

Great post.  Onto your points:-

You hit upon the key point: building tools for an employer means you'd better hope he wants you to keep building NEW tools in the future. Each tool you build is essentially a one-off. It's not like a factory with constant demand. Software tools can run for decades (see how much COBOL still keeps the world functioning). You might get to maintain the tool, but that's rarely a full-time role. If maintenance is bug-fixing rather than improvements, then it's diminishing returns on your necessity within the company. That's the beauty (and danger) of code - it's so efficient. As you say, if you develop your OWN tools, you can resell them over and over to different clients (actually, this is essentially what I've been doing for the last 13 years).

Absolutely.  Where's the market to develop a brand new operating system? A new search engine? New social platforms? New e-commerce / content management templates?   All of these markets have established solutions that have taken tens of thousands of developer years to develop. This is the evolution of code - we will ALWAYS arrive at this point.  It would be highly inefficient to have billions more lines of code written that do the same thing as existing software/apps.  Nobody's going to make the time/money investment to come up with a commercial competitor to Windows in 2017.  Why did Google use the Linux kernel to develop Android? Because even with the number of programmers at their disposal, they're not going to reinvent the wheel. That wheel would take many extra years of time to develop.  Any complex product is designed to be prototyped and templated. 

Sure, a new company could in theory reinvent the wheel in their niche, but as each year passes, and as each app/software codebase becomes more established, the barrier to entry gets higher.  There is no monopolisation like monopolisation in software/web apps. 

To sum up, being a programmer from here on out is not enough, even though it WAS enough even just 10 years ago.  You'll need to really specialise and dig deeper, or go it alone, have good all-round business and marketing skills, and sell (or use) your own niche toolsets to make a living.

Thanks, your reply is also very good. You put what I was trying to say into better words. Now these b!tches will pay attention. Wonder how many of the people disagreeing with us actually have experience developing their own tools and scripts or running a small business? I've done all three to some extent.

Most of the people not agreeing with us, I would guess, are 40+ and part of the generation which still caught the ride of easy job offers and promotions, and by now are still sailing nicely on paper work experience alone, since they have been in IT for over 10 years.

Come on, I want to see someone under 25 come and give their opinion. Sure, the thread title is about over-35s, but the overall topic is arguing about how employment opportunities in IT now compare to 15-20 years ago. Things ARE tougher now.


3
HOLA444
15 minutes ago, canbuywontbuy said:

Great post.  Onto your points:-

You hit upon the key point: building tools for an employer means you'd better hope he wants you to keep building NEW tools in the future. Each tool you build is essentially a one-off. It's not like a factory with constant demand. Software tools can run for decades (see how much COBOL still keeps the world functioning). You might get to maintain the tool, but that's rarely a full-time role. If maintenance is bug-fixing rather than improvements, then it's diminishing returns on your necessity within the company. That's the beauty (and danger) of code - it's so efficient. As you say, if you develop your OWN tools, you can resell them over and over to different clients (actually, this is essentially what I've been doing for the last 13 years).

Absolutely.  Where's the market to develop a brand new operating system? A new search engine? New social platforms? New e-commerce / content management templates?   All of these markets have established solutions that have taken tens of thousands of developer years to develop. This is the evolution of code - we will ALWAYS arrive at this point.  It would be highly inefficient to have billions more lines of code written that do the same thing as existing software/apps.  Nobody's going to make the time/money investment to come up with a commercial competitor to Windows in 2017.  Why did Google use the Linux kernel to develop Android? Because even with the number of programmers at their disposal, they're not going to reinvent the wheel. That wheel would take many extra years of time to develop.  Any complex product is designed to be prototyped and templated. 

Sure, a new company could in theory reinvent the wheel in their niche, but as each year passes, and as each app/software codebase becomes more established, the barrier to entry gets higher.  There is no monopolisation like monopolisation in software/web apps. 

To sum up, being a programmer from here on out is not enough, even though it WAS enough even just 10 years ago.  You'll need to really specialise and dig deeper, or go it alone, have good all-round business and marketing skills, and sell (or use) your own niche toolsets to make a living.

Thanks, your reply is also very good. You put what I was trying to say into better words. Now these b!tches will pay attention. Wonder how many of the people disagreeing with us actually have experience developing their own tools and scripts or running a small business? I've done all three to some extent.

Most of the people not agreeing with us, I would guess, are 40+ and part of the generation which still caught the ride of easy job offers and promotions, and by now are still sailing nicely on paper work experience alone, since they have been in IT for over 10 years.

Come on, I want to see someone under 25 come and give their opinion. Sure, the thread title is about over-35s, but the overall topic is arguing about how employment opportunities in IT now compare to 15-20 years ago. Things ARE tougher now.


4
HOLA445
14 hours ago, spyguy said:

Yes. But they are normally pretty static.

Once you start using a website as a web front end - order processing, kicking off events, HA - then it gets a lot more complex.

See:

 

From your link:-

Quote

To a company, they were all graphic designers who produce static, non-dynamic websites. None had any capacity to handle any form of development, or any ideas on how to test their websites.

Your anecdote features a very bizarre web design company that aren't using any of the readily available content management templates out there that would have met the requirements. There's a massive, massive over-supply of cheapo web design companies who do use these templates (WordPress, Drupal, Joomla!, Magento, Shopify, et al). I'm not even saying all of these solutions are great. Often they're massive overkill, especially if they're not efficiently set up (they host on a shared server, no caching), BUT it matters not to the client - they just see a nice-looking site (often because the home page is nothing but a pretty photo) they can update themselves ... and more importantly, to a web design company with soft/low skills, it's easy to set up. Templates have killed the web design/development industry because the barrier to entry to be a "web designer/developer" is so low - there's a massive over-supply of these businesses now.


5
HOLA446
2 hours ago, developer said:

Thanks, your reply is also very good. You put what I was trying to say into better words. Now these b!tches will pay attention. Wonder how many of the people disagreeing with us actually have experience developing their own tools and scripts or running a small business? I've done all three to some extent.

Most of the people not agreeing with us, I would guess, are 40+ and part of the generation which still caught the ride of easy job offers and promotions, and by now are still sailing nicely on paper work experience alone, since they have been in IT for over 10 years.

Come on, I want to see someone under 25 come and give their opinion. Sure, the thread title is about over-35s, but the overall topic is arguing about how employment opportunities in IT now compare to 15-20 years ago. Things ARE tougher now.

Maybe I'm an anomaly, but I agree with all of your points, and I'm in my mid-40s and have been a programmer for 35 years if I include my early years learning ZX BASIC and Z80 assembler. I've worked in large companies and small companies, and have been running my own small business for the last 13 years. The pushback against our idea of "programming eating itself" is almost exclusively coming from experienced developers working in large corporations who are specialising in one way or another. Sure, there will always be a need for specialists in any industry (who will be well paid), but it doesn't stop the cull of the larger herd that existed fine even just 10 years ago.

This is because of the two points you highlighted (monopolisation of established codebases/software/apps, and open source) - efficiencies inherent in code itself, which is designed to be perfectly reused ad infinitum, with zero built-in/planned obsolescence (note that COBOL powers much of the banking, insurance and retail systems - pick your Google result from this search!). There is no doubt that it's a lot tougher to get into IT today than it was 10 years ago, and certainly WAY tougher than 20 years ago. "Specialise!" isn't an appropriate response to the original article and the phenomenon we are discussing - that could be applied to any industry, including accounting. I see this trend only continuing as the big players further entrench and consolidate their positions.


6
HOLA447
7
HOLA448
4 hours ago, canbuywontbuy said:

From your link:-

Your anecdote features a very bizarre web design company that aren't using any of the readily available content management templates out there that would have met the requirements. There's a massive, massive over-supply of cheapo web design companies who do use these templates (WordPress, Drupal, Joomla!, Magento, Shopify, et al). I'm not even saying all of these solutions are great. Often they're massive overkill, especially if they're not efficiently set up (they host on a shared server, no caching), BUT it matters not to the client - they just see a nice-looking site (often because the home page is nothing but a pretty photo) they can update themselves ... and more importantly, to a web design company with soft/low skills, it's easy to set up. Templates have killed the web design/development industry because the barrier to entry to be a "web designer/developer" is so low - there's a massive over-supply of these businesses now.

Almost but not quite.

The requirement, to put it in the simplest terms, is for a web front end - think a web GUI to some database-driven application. Basically the customer goes in, selects some options, enters some dimensions and details, double-checks, generates a PDF, and kicks off a sales order. All done, no need for a human to get involved.

There's some form entry, then it has to go away, do some calculations and generate a few files.

Literally all the web companies were just generating big, heavy, graphics-oriented webpages. Great if it's some sort of vanity project. Useless if you wanted to actually code and sequence stuff.

I've no problem with templates killing off web design - most of it was sh1t. Having something like Bootstrap is great. It comes with some nice, clear defaults, handles layout, etc. Mix that Bootstrap with HTML5 and there's something I can work with a bit - unlike the previous attempts at HTML.
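The flow spyguy describes - form entry, server-side calculation, document generation, order kick-off - can be sketched in a few dozen lines. This is a rough illustration only: every name, field and price here is invented, and a real build would sit behind a web framework, write an actual PDF, and enqueue the order in a back-end system:

```python
from dataclasses import dataclass

@dataclass
class OrderRequest:
    """Hypothetical form submission: dimensions in mm plus an option."""
    width_mm: int
    height_mm: int
    material: str

# Made-up price table, £ per square metre.
PRICE_PER_SQM = {"oak": 120.0, "pine": 60.0}

def validate(req: OrderRequest) -> None:
    """The double-check step: reject bad form input before quoting."""
    if req.material not in PRICE_PER_SQM:
        raise ValueError(f"unknown material: {req.material}")
    if req.width_mm <= 0 or req.height_mm <= 0:
        raise ValueError("dimensions must be positive")

def quote(req: OrderRequest) -> float:
    """Server-side calculation: area in m^2 times the unit price."""
    area_sqm = (req.width_mm / 1000) * (req.height_mm / 1000)
    return round(area_sqm * PRICE_PER_SQM[req.material], 2)

def generate_order(req: OrderRequest) -> str:
    """Generate the customer-facing document (a plain-text stand-in
    for the PDF); a real system would also kick off the sales order."""
    validate(req)
    price = quote(req)
    return (f"ORDER CONFIRMATION\n"
            f"{req.width_mm} x {req.height_mm} mm, {req.material}\n"
            f"Total: £{price:.2f}\n")

doc = generate_order(OrderRequest(width_mm=2000, height_mm=1000, material="oak"))
print(doc)
```

The point of the anecdote is that none of this is graphic design: it's validation, calculation and sequencing - exactly the part the template-driven web shops couldn't supply.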


8
HOLA449
5 hours ago, canbuywontbuy said:

From your link:-

Your anecdote features a very bizarre web design company that aren't using any of the readily available content management templates out there that would have met the requirements. There's a massive, massive over-supply of cheapo web design companies who do use these templates (WordPress, Drupal, Joomla!, Magento, Shopify, et al). I'm not even saying all of these solutions are great. Often they're massive overkill, especially if they're not efficiently set up (they host on a shared server, no caching), BUT it matters not to the client - they just see a nice-looking site (often because the home page is nothing but a pretty photo) they can update themselves ... and more importantly, to a web design company with soft/low skills, it's easy to set up. Templates have killed the web design/development industry because the barrier to entry to be a "web designer/developer" is so low - there's a massive over-supply of these businesses now.

I should say, all the web companies we visited/went to visit were bizarre.

They were all weird. Full of women and bright young men. But just no depth to them.

Nice chairs, though.


9
HOLA4410
5 hours ago, canbuywontbuy said:

Great post.  Onto your points:-

You hit upon the key point: building tools for an employer means you'd better hope he wants you to keep building NEW tools in the future. Each tool you build is essentially a one-off. It's not like a factory with constant demand. Software tools can run for decades (see how much COBOL still keeps the world functioning). You might get to maintain the tool, but that's rarely a full-time role. If maintenance is bug-fixing rather than improvements, then it's diminishing returns on your necessity within the company. That's the beauty (and danger) of code - it's so efficient. As you say, if you develop your OWN tools, you can resell them over and over to different clients (actually, this is essentially what I've been doing for the last 13 years).

Absolutely.  Where's the market to develop a brand new operating system? A new search engine? New social platforms? New e-commerce / content management templates?   All of these markets have established solutions that have taken tens of thousands of developer years to develop. This is the evolution of code - we will ALWAYS arrive at this point.  It would be highly inefficient to have billions more lines of code written that do the same thing as existing software/apps.  Nobody's going to make the time/money investment to come up with a commercial competitor to Windows in 2017.  Why did Google use the Linux kernel to develop Android? Because even with the number of programmers at their disposal, they're not going to reinvent the wheel. That wheel would take many extra years of time to develop.  Any complex product is designed to be prototyped and templated. 

Sure, a new company could in theory reinvent the wheel in their niche, but as each year passes, and as each app/software codebase becomes more established, the barrier to entry gets higher.  There is no monopolisation like monopolisation in software/web apps. 

To sum up, being a programmer from here on out is not enough, even though it WAS enough even just 10 years ago.  You'll need to really specialise and dig deeper, or go it alone, have good all-round business and marketing skills, and sell (or use) your own niche toolsets to make a living.

Well... Google did not create Android. They bought the company that created Android.

And, due to the limitations of Android, which I'm painfully aware of..., Google are looking at replacing it:

https://en.wikipedia.org/wiki/Google_Fuchsia

They are not starting from scratch - it's another attempt at BeOS, taken from the bloke who did NewOS.


10
HOLA4411
6 hours ago, Mikhail Liebenstein said:

This 40+ age thing is becoming irrelevant in IT. It was the case 10 years ago, but the boomers have exited on fat pensions, Gen Xers with experience are in short supply, and everyone has failed to train up Gen Y.

My firm fired a load of under 30s last year (team of 20) in the UK as they were massively underperforming. Literally, they achieved less than a couple of experienced 30-40 year olds. So, yes they were perhaps 1/3 the cost individually, but they were probably 1/10th as productive on the same basis. This is also seen in offshoring.

We also have the same thing in sales. I can hire a good 36-55 year old and they can generate £4m of revenue at 50% margin. The team of inside sales reps (about 16) all in their early 20s generated less than £800k all year at about 15 points of margin.

The causes of this are: a) Gen Y are basically Generation Snowflake, due to a lack of constructive criticism growing up; b) IT has generally failed on training since the late 1990s, as has the wider UK; c) degrees became less technical.

Going back perhaps a decade and a half after the Unix Wars, I also think that the downfall of Sun Microsystems was a big factor. They used to give away loads of stuff to universities; after that, everything became Windows with the odd outpost of Linux for the clever people. Whilst an easy-to-use GUI is great for the productivity of the masses, it completely dumbs down the minds of future technical staff if they aren't forced to script.

I have some hope that Cloud CLIs, APIs, Docker and IoT might fix things, but then in the UK we have 2 year olds who have been brought up to use tablets. I'm forcing my kids to use Linux and Android tools before they get anywhere near Windows, OSX and iOS.

There's a difference between someone who is 40 today and someone who was 40 twenty years ago.

Someone who was 20 forty years ago (mid-70s) would not have started out coding; they would have fallen into it. Back then there was not much in the way of standard languages, tools, libraries or OSes. Everything was proprietary - even Unix was in those days. Windows did not exist. And computers were *VERY* expensive - multi-millions for something with less oomph than your phone.

A 40 year old today would have been 20 in '97 - Linux was in full swing, C ruled, TCP/IP/sockets were standard, SQL was well formed. All of those were in place.

The big events for today's software occurred in the late 80s - the ANSI C book/spec (1988) and the 386 (the first cheap-ish and fast-ish CPU with an MMU; Linux came about as an attempt to use the 386 MMU). I'd also shove in 4.4BSD (1990ish), which gave a reference implementation of TCP/IP.

In my dealings with companies - not just in the UK - there's a wedge of people in the range of mid 30s to early 50s who seem to do most of the complex software.

These are people who have spent most of their working life producing software.

Older than 55 and they just don't have the exposure.

Younger than 35 and they've been a bit ruined by both Java (which I regard as sh1t) and limited exposure to machine architecture.

They're also affected by the 'Indians will do all the work' attitude at a lot of companies.

Have a look at QCon presentations and look at the ages of the people.


11
HOLA4412
36 minutes ago, spyguy said:

Well... Google did not create Android. They bought the company that created Android.

And, due to the limitations of Android, which I'm painfully aware of..., Google are looking at replacing it:

https://en.wikipedia.org/wiki/Google_Fuchsia

They are not starting from scratch - it's another attempt at BeOS, taken from the bloke who did NewOS.

Android... yughhh.

Full of security holes - so much so that restricted (non-administrative) user accounts are now locked out of any SD card access. This is a total bugger when the OS soaks up most of the 8 or 16GB of storage and your kids want to take their cheap Droid tablets on holiday pre-loaded with content to keep them quiet on the plane. You end up having to let them use the admin account - just make sure you password-protect everything and don't let them watch you enter the PIN code.


12
HOLA4413

Anyone who says programming is boring maybe doesn't get the pleasure I get out of problem solving. Yes, of course, if someone has given you the solution as pseudo-code and you're just a code jockey, then I get it, but if you're presented with problems which you have to design and build solutions for, then it's pleasurable. I'm in GIS, and the only days I've not enjoyed are those where there's been no work and I get bored. I've also had a period of 15 years with no unemployment outside of the odd month here and there, and sabbaticals where I go travelling with the kids and wife. Contracting gives me that flexibility. Due to the niche area I am in, I get well paid, I am parachuted in, do my work, and I leave. My work changes often and I've worked in most industries. Last year, I was contracting abroad, and when they offered me my second extension I decided not to take it, and took the kids around Europe for 6 months instead. I returned just before Xmas, took Xmas off, and am now working at this client until March 31st, when I leave for another company.

There is no way on earth that:

i) Being a permie would give me that flexibility

ii) I would have earned even 30% of what I have earned contracting had I been a permie

iii) I would have the same pension I currently have if I were a permie

iv) I would have had the same standard of living that I have had I been a permie

Life's been pretty good. If it changes, I will too; if not, I'll carry on.

But seriously, if you don't like problem solving, then IT is maybe not for you - and that's all IT really is...


13
HOLA4414
7 minutes ago, Mikhail Liebenstein said:

Android... yughhh.

Full of security holes - so much so that restricted (non-administrative) user accounts are now locked out of any SD card access. This is a total bugger when the OS soaks up most of the 8 or 16GB of storage and your kids want to take their cheap Droid tablets on holiday pre-loaded with content to keep them quiet on the plane. You end up having to let them use the admin account - just make sure you password-protect everything and don't let them watch you enter the PIN code.

I have an Android tablet - as a lot on here know. Drives me nuts.

It's on its last legs - it cannot hold a charge.

I am getting a new Android tablet, only because it's the only OS that's likely to be rootable.


14
HOLA4415
1 minute ago, HairyOb1 said:

Anyone who says programming is boring maybe doesn't get the pleasure I get out of problem solving. Yes, of course, if someone has given you the solution as pseudo-code and you're just a code jockey, then I get it, but if you're presented with problems which you have to design and build solutions for, then it's pleasurable. I'm in GIS, and the only days I've not enjoyed are those where there's been no work and I get bored. I've also had a period of 15 years with no unemployment outside of the odd month here and there, and sabbaticals where I go travelling with the kids and wife. Contracting gives me that flexibility. Due to the niche area I am in, I get well paid, I am parachuted in, do my work, and I leave. My work changes often and I've worked in most industries. Last year, I was contracting abroad, and when they offered me my second extension I decided not to take it, and took the kids around Europe for 6 months instead. I returned just before Xmas, took Xmas off, and am now working at this client until March 31st, when I leave for another company.

There is no way on earth that:

i) Being a permie would give me that flexibility

ii) I would have earned even 30% of what I have earned contracting had I been a permie

iii) I would have the same pension I currently have if I were a permie

iv) I would have had the same standard of living that I have had I been a permie

Life's been pretty good. If it changes, I will too; if not, I'll carry on.

But seriously, if you don't like problem solving, then IT is maybe not for you - and that's all IT really is...

How much are PostGIs and QGIS part of your contracts?


15
HOLA4416
Just now, spyguy said:

How much are PostGIs and QGIS part of your contracts?

QGIS not so much, but PostgreSQL/PostGIS more so. Last gig was PostgreSQL/PostGIS with a customised QGIS front end; next gig is PostgreSQL/PostGIS with a .NET front end; current gig is all MS and ESRI, although I'm also doing some discovery on new open-source gear for them to use (Leaflet, Carto, GDAL/OGR (Python), OpenLayers and more).
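For anyone wondering what the PostGIS end of such gigs actually involves: much of it is spatial predicates (is this point inside that parcel, does this route cross that zone) evaluated inside SQL via functions like ST_Contains. As a rough plain-Python illustration of the underlying idea - this is the classic ray-casting test, not PostGIS code, and the square polygon is just a toy example:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count how many polygon edges a horizontal ray
    from (x, y) crosses; an odd count means the point is inside.
    This is the kind of predicate ST_Contains evaluates in PostGIS."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the ray's y, and is the crossing to the right of x?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))   # True - inside
print(point_in_polygon(5, 2, square))   # False - outside
```

In PostGIS proper you would push this into the database instead, e.g. `SELECT ST_Contains(poly, pt)`, and let a spatial index do the heavy lifting over millions of geometries.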


16
HOLA4417
1 minute ago, HairyOb1 said:

QGIS not so much, but PostgreSQL/PostGIS more so. Last gig was PostgreSQL/PostGIS with a customised QGIS front end, next gig is PostgreSQL/PostGIS with a .NET front end, current gig is all MS and ESRI, although I'm also doing some discovery on new open-source gear for them to use (Leaflet, Carto, GDAL/OGR (Python), OpenLayers and more).

 

Thanks.


17
HOLA4418
1 hour ago, spyguy said:

Almost but not quite.

The requirement, to put it in simplest terms, is for a web front end - think a web GUI to some database-driven application. Basically the customer goes in, selects some options, enters some dimensions and details, double-checks, generates a PDF for himself, kicks off a sales order. All done, no need for a human to get involved.

There's some form entry, then it has to go away and do some calculations and generate a few files.

Literally all the web companies were just generating big, heavy, graphics-orientated webpages. Great if it's some sort of vanity project. Useless if you want to actually code and sequence stuff.

I've no problem with templates killing off web design - most of it was sh1t. Having something like Bootstrap is great. Comes with some nice, clear defaults. Handles layout, etc. Mix that Bootstrap with HTML5 and there's something I can work with a bit - unlike the previous attempts at HTML.

Looks like those companies offer half a solution (front-end only, knowing nothing about DBs / server-side coding) when everyone wants a whole solution (an "easy to update website") - in an industry saturated with an over-supply of whole solutions.  I don't imagine any company would last very long offering static HTML websites post-1990s. 
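The pipeline described above (form entry → validation → calculations → generated files → sales order) is simple enough to sketch end-to-end in a few lines of Python. Every name, field and rate below is invented for illustration - a real system would sit behind a web framework and render a proper PDF rather than JSON:

```python
import json
from pathlib import Path

def price_quote(width_mm, height_mm, rate_per_m2=120.0):
    """Toy calculation step: price from the customer's dimensions."""
    area_m2 = (width_mm / 1000) * (height_mm / 1000)
    return round(area_m2 * rate_per_m2, 2)

def process_order(form, out_dir="orders"):
    # Validation: the "double-check" step before anything is generated.
    if form["width_mm"] <= 0 or form["height_mm"] <= 0:
        raise ValueError("dimensions must be positive")
    quote = price_quote(form["width_mm"], form["height_mm"])
    order = {**form, "price": quote}
    # File-generation step (stand-in for rendering the PDF and raising the order).
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    (out / "order.json").write_text(json.dumps(order))
    return order

result = process_order({"width_mm": 2000, "height_mm": 1000})
```

The point is that the hard part is the sequencing and calculation logic, not the page layout - which is exactly what the graphics-orientated web shops couldn't supply.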


18
HOLA4419
8 hours ago, Mikhail Liebenstein said:

The cause of this is: a) Gen Y are basically Generation Snowflake due to lack of constructive criticism growing up; b) IT generally has failed on training since the late 1990s, as has the wider UK; c) degrees became less technical.

 

When I was at school/college we did Computer Science, now they do IT. 

For us it was all about understanding, at the machine level, what was going on; having looked at various test papers for IT, it's all about software and applications.

I'm not saying that's necessarily wrong - it teaches kids how to learn and interact with computers - however it's not even giving them a start in actually becoming a developer. It becomes a problem when people think we are giving kids a grounding in technology.


19
HOLA4420
6 hours ago, canbuywontbuy said:

From your link:-

Your anecdote features a very bizarre web design company that aren't using any of the readily available content management templates out there that would have met the requirements. There's a massive, massive over-supply of cheapo web design companies who do use these templates (WordPress, Drupal, Joomla!, Magento, Shopify, et al).  I'm not even saying all of these solutions are great.

My opinion on this stuff is that it's not taking away jobs from developers, at least not to any meaningful level because it's serving a sector (small business) that would never be spending the level of money required to do this from scratch. 

Every man and his dog now has a website which they don't need. They have them because these sort of templating systems exist. 


20
HOLA4421
19 minutes ago, gilf said:

When I was at school/college we did Computer Science, now they do IT. 

For us it was all about understanding, at the machine level, what was going on; having looked at various test papers for IT, it's all about software and applications.

I'm not saying that's necessarily wrong - it teaches kids how to learn and interact with computers - however it's not even giving them a start in actually becoming a developer. It becomes a problem when people think we are giving kids a grounding in technology.

My eldest is 13 and looking at assembly language, cryptology, how the computer actually works - including the clock and fetch-execute cycles - Python, Java programming, ciphers, everything.  To be honest, as a 13-year-old, she's doing work I started out doing in the first semester of my Computer Science degree 25-odd years back.

Computer Science is still a subject, and it's what I would consider proper computer science, not computing, not IT.
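The fetch-execute cycle mentioned above is the kind of thing that can be shown with a toy interpreter in a dozen lines of Python - exactly the sort of first-semester exercise being described. The instruction set here is invented for illustration:

```python
# A toy fetch-decode-execute loop. Each tick: fetch the instruction at the
# program counter, decode it, execute it against the accumulator and memory.
PROGRAM = [
    ("LOAD", 5),    # acc = 5
    ("ADD", 3),     # acc += 3
    ("STORE", 0),   # mem[0] = acc
    ("HALT", None),
]

def run(program):
    acc, pc, mem = 0, 0, {}
    while True:
        op, arg = program[pc]   # fetch + decode
        pc += 1                 # advance the program counter
        if op == "LOAD":        # execute
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "STORE":
            mem[arg] = acc
        elif op == "HALT":
            return mem

memory = run(PROGRAM)  # memory ends up as {0: 8}
```

A student who has built something like this understands what the clock is actually driving - which is precisely the gap between Computer Science and "IT".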


21
HOLA4422
13 hours ago, spyguy said:

There is limited oil exploration at the mo.

The maths would be CUDA on a GPU, not a Xeon.

CUDA is not always an option. If you want to just write standard, scalable, multi-threaded C++ code that will keep on giving performance improvements for the next decade, you probably wouldn't want to tie yourself into Nvidia GPUs and the associated APIs. Besides, when considering the cloud as a potential host you wouldn't want to restrict yourself to Nvidia GPUs as they won't necessarily be present.

I'm currently working with the next generation of software-based HEVC codecs and they are slowly dropping CUDA support as (to quote one developer I spoke to) "it's a complete ball ache to support and maintain". They are eyeing up the scalability offered by the next few generations of Intel processors to run standard (well-designed) C++ code.


22
HOLA4423
40 minutes ago, gilf said:

My opinion on this stuff is that it's not taking away jobs from developers, at least not to any meaningful level because it's serving a sector (small business) that would never be spending the level of money required to do this from scratch. 

Every man and his dog now has a website which they don't need. They have them because these sort of templating systems exist. 

It's an example of what I was saying earlier - the pervasiveness of cheap/free solutions has created opportunities. The thing about any of the turn-key CMS-type frameworks is that the second you need to go even slightly off-piste requirements-wise, they become more trouble than they are worth, so there is still plenty of work for web programmers.


23
HOLA4424
13 minutes ago, Broken biscuit said:

CUDA is not always an option. If you want to just write standard, scalable, multi-threaded C++ code that will keep on giving performance improvements for the next decade, you probably wouldn't want to tie yourself into Nvidia GPUs and the associated APIs. Besides, when considering the cloud as a potential host you wouldn't want to restrict yourself to Nvidia GPUs as they won't necessarily be present.

I'm currently working with the next generation of software-based HEVC codecs and they are slowly dropping CUDA support as (to quote one developer I spoke to) "it's a complete ball ache to support and maintain". They are eyeing up the scalability offered by the next few generations of Intel processors to run standard (well-designed) C++ code.

Well, I don't think CUDA is the start and end.

Don't confuse threading/multi-tasking with massive parallelism, which is the idea behind CUDA.

C++'s memory model is a relatively new and clunky thing. It's still based on a simple, single memory space.

CUDA is for massive parallelism, on lots of tiny, simple cores with their own fixed-size float units.
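The distinction being drawn here can be illustrated, very loosely, even in plain Python: coarse-grained threading hands a few big slices of work to a handful of OS threads, whereas CUDA-style data parallelism is one tiny kernel applied independently to every element. Everything below is a sketch with invented names, not real GPU code:

```python
from concurrent.futures import ThreadPoolExecutor

# Coarse-grained threading: a handful of threads, each owning a big slice.
def sum_slice(chunk):
    return sum(chunk)

data = list(range(1_000_000))
n = 4
size = len(data) // n
parts = [data[i * size:(i + 1) * size] for i in range(n)]
with ThreadPoolExecutor(max_workers=n) as pool:
    total = sum(pool.map(sum_slice, parts))

# Kernel-style data parallelism (what a GPU does in hardware): the same tiny
# float operation applied independently to every element - here, SAXPY.
def saxpy_element(a, x, y):
    return a * x + y

xs = [1.0, 2.0, 3.0]
ys = [4.0, 5.0, 6.0]
result = [saxpy_element(2.0, x, y) for x, y in zip(xs, ys)]
```

On a GPU the second pattern runs as thousands of lightweight hardware threads at once, one per element - which is why it doesn't map onto the threading model of standard C++.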

 

 


24
HOLA4425

Archived

This topic is now archived and is closed to further replies.
