Would most normal Python programmers think I have a static stick up my call stack?

 
Jesse Silverman
Ranch Foreman
Posts: 338
I understand that Python is a dynamic language and will always remain so.

I also see that most places that depend on Python in production seem to be embracing static typing for larger pieces of code, and that the Python team has been putting ever-increasing amounts of work into enabling it as an opt-in.

I am gradually becoming more comfortable with many Pythonisms (and not just references to Cheese Shops and Parrots).

I was doing some HackerRank problems practicing stuff I think I know.

I saw this (Python 2 syntax and all) in the intro to one of them:

>>> a = "this is a string"
>>> a = a.split(" ") # a is converted to a list of strings.
>>> print a
['this', 'is', 'a', 'string']



Now, a couple of years back I made it to the final two candidates at a fascinating job that had all sorts of languages in its mix; we spent a lot of time talking about GoLang and Rust....

They also had some significant amounts of Python.

I mentioned "I really don't like that a variable can change its type.  That creeps me out."

The interviewer responded "Oh gosh, that's just horrible, of course you should NEVER do that, just because the language allows it."

I think it is nifty and neat that you can use the same language for small, throw-away scripts (or even at a REPL) and real production stuff that gets code-reviewed and checked-in and unit tested and what-not.

I read in one place the guideline: "If you would write unit tests for something, you should probably think about having static type-checking in place."
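
To make that concrete, here is a minimal sketch of the opt-in checking applied to the HackerRank snippet above, assuming mypy as the checker (any of the competing tools would do). The annotation is ignored at runtime; only the checker complains:

a: str = "this is a string"
a = a.split(" ")  # mypy reports, roughly: error: Incompatible types in
                  # assignment (expression has type "list[str]",
                  # variable has type "str")
print(a)          # Python itself happily prints ['this', 'is', 'a', 'string']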

I noticed someone here on the forums who worked with a lot of medium-scale Python and found it odious; he converted much of it to Java, if I recall correctly.

He had two big complaints.

1. The indentation vs. { } thing was only cute at small scale; in larger programs he detested it.
2. Arrrrgh -- figuring out what type everything was when he got into new areas of the code was absolutely maddening. Once he had done it, he didn't want to do it again, and he took the opportunity offered to convert the stuff he had to maintain to Java...

His first issue isn't bothering me yet.  I find it interesting that not a whole lot of languages have followed in Python's Semantic Indentation footsteps, but I am cool with it.

His second issue seems like it might not be a big deal anymore; you could be one of those teams that opts in to static type-checking (after you agree on which of the competing systems to use, I guess)...

I do hate seeing code that actually re-purposes variables to hold different unrelated types at different times tho -- that is going way too far in any code that won't fit on a screen or would get checked in, in my opinion.

For now, I am going to follow the advice that "Just because a language lets you do something that you think is a bad idea doesn't mean you should ever do it" and completely avoid what I consider the abuse of dynamic typing in any code I write, even toy code and practice examples.

There are other things specific to Python I now appreciate, including keyword parameters, which freaked me out when I first saw them in C# because they expose internals right thru the interface. But they are so darn convenient when used the way print() uses sep and end.
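
To show what I mean, a small sketch (grep_lines is a hypothetical helper of mine, not from any library; the / and * separators need Python 3.8+):

def grep_lines(lines, pattern, /, *, ignore_case=False, invert=False):
    # Parameters before the / are positional-only, so their names never
    # become part of the interface; parameters after the * must be passed
    # by keyword, print()-style.
    if ignore_case:
        pattern = pattern.lower()
    for line in lines:
        haystack = line.lower() if ignore_case else line
        if (pattern in haystack) != invert:
            yield line

text = ["Spam", "eggs", "spam and eggs"]
print(*grep_lines(text, "spam", ignore_case=True), sep=" | ", end=".\n")
# prints: Spam | spam and eggs.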
 
Tim Holloway
Saloon Keeper
Posts: 23438

Jesse Silverman wrote:I do hate seeing code that actually re-purposes variables to hold different unrelated types at different times tho



But isn't that the very essence of polymorphism?

I admit that it's taking things to extremes though. Kind of like Java if it had no primitives and all variables were declared type java.lang.Object

Back in the days of Fortran and COBOL, a common gripe was that "constants - aren't and variables - won't". And in my opinion the fact that that's also true in Python is one of its more dangerous features. In extreme cases, it's likely to lead to the use of "magic numbers", and that's definitely a step back into the past.
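
"Constants - aren't" is easy to demonstrate in Python, though the same opt-in tooling discussed above offers a partial fix. A minimal sketch, assuming a mypy-style checker that understands typing.Final:

from typing import Final

MAX_RETRIES: Final = 3   # convention plus Final say "constant"; the runtime won't enforce it

MAX_RETRIES = 99         # runs fine, but a checker reports, roughly:
                         # error: Cannot assign to final name "MAX_RETRIES"
print(MAX_RETRIES)       # 99 -- the "constant" wasn't

And giving the number a name at all is exactly what keeps "magic numbers" out of the logic, so it's a name worth protecting.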

The indentation thing is actually rather neat - providing you never ever use physical tab characters. It may not be seen in many other programming languages, but it is an essential characteristic of YAML, where it contributes greatly to the terseness of its form.
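
On the tab characters point, Python 3 at least refuses to guess. A tiny demonstration, compiling a deliberately mixed-indentation snippet from a string:

src = "if True:\n    x = 1\n\ty = 2\n"   # four spaces, then a tab
try:
    compile(src, "<demo>", "exec")
except TabError as err:
    print(err)   # inconsistent use of tabs and spaces in indentation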
 
Master Rancher
Posts: 3831
As a non-Python person who only occasionally dabbles, I agree with your reactions to the two Python complaints.  I have never understood what problem people have with Python indentation; I think it's a great idea, and I wish it had caught on elsewhere.  But re-using a variable as a different type seems just wrong.  I guess if I ever do more with Python, I should check out the static type-checking - it's one of the things I most like about Java (and Scala and Kotlin).  It will be interesting to see whether your feelings stay the same over time, or whether you become more Pythonic and find yourself rebelling against the straitjacket of strong typing.
 
Tim Holloway
Saloon Keeper
Posts: 23438
Almost forgot - there's always Hungarian notation. It's worthless for compiler enforcement, but at least you have a visual reminder.

The downside to Hungarian notation for me was that I'd often change datatypes as I shaped an app - an int might become a boolean, which might expand into an enum, and so forth - so the type hint encoded in the name quickly became useless. That was back before refactoring IDEs, though; they can handle the renaming without a lot of tedious search-and-replace work.
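
Python's annotations arguably give you the same visual reminder, but in a place the tooling can verify, so a type change can't leave a stale prefix behind. A small sketch:

# The Hungarian way: the type lives in the name, and nothing keeps it honest.
intCount = "forty-two"    # was once an int; the prefix is now a lie

# The annotated way: the hint is machine-checked, and changing the type
# is one edit to the annotation, with no renaming to forget.
count: int = 42
label: str = f"{count} items"
print(label)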
 
Jesse Silverman
Ranch Foreman
Posts: 338
I'm going to set myself down as someone who used to strongly like Hungarian notation but has since been swayed by the notion that, even when types don't change, we are usually in an IDE these days and the prefix just uses up precious characters and screen real estate.

I came up thru BASIC/FORTRAN/ASSEMBLER/(PL/I)/Pascal and C in environments where the editor (and, come to think of it, the proto-debuggers!) had zero knowledge of the language.
It absolutely, definitely, positively served a useful purpose then, but now not so much.

C and Java 8 (as featured on HackerRank and Android) are the only languages lots of people still use (okay, lots of people I work with still use) that don't have type inference in most of the places that need it.  So when people deride static typing as meaning you need to declare every last thing explicitly, it sounds like complaining about something that barely exists any more.
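
The Python checkers work the same way: annotate the boundaries and let inference fill in the rest. A minimal sketch, assuming mypy or a similar checker and Python 3.9+ for the dict[str, int] spelling (runtime support for typing.reveal_type arrived in 3.11, so there's a fallback):

try:
    from typing import reveal_type   # runtime version, Python 3.11+
except ImportError:
    def reveal_type(x): return x     # no-op stand-in for older runtimes

def parse_pairs(text: str) -> dict[str, int]:
    # Only the function boundary is annotated; every local is inferred.
    pairs = (item.split("=") for item in text.split(","))
    result = {key: int(value) for key, value in pairs}
    reveal_type(result)   # checker reports, roughly: dict[str, int]
    return result

print(parse_pairs("a=1,b=2"))   # {'a': 1, 'b': 2}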

My strongest feeling about Python at this moment is that before, I was thinking only about programs I *did* write (or had to maintain) and saying "They'd be slower and less type-safe in Python!"

That was true, but now that I am doing it, I am thinking about all the programs I could have written that would have been fun and interesting, but that I didn't bother with because it was just too much hassle in C/C++, or even Java or C# (partially because I didn't know them well enough at the time to be fast), or Perl 5 (because I didn't use it often enough to remember it well).  I could have thrown them together quickly in Python, and if they really seemed useful, I could always have made industrial-strength versions in one of those languages.  Unless, of course, they made use of some nifty Python libraries with no direct equivalents I knew about, which is another reason I might not have written them.

All of these languages have their positives and negatives and places where they are great or not so great to use.  If I had infinite time I'd learn them all.
In practice, my urge to know "everything" about the languages I already use overrides the fun of learning "all of them", so I focus on the ones I am likely to be able to buy tempeh and seitan with.

On my own question, I am leaning towards saying I understand why the Python Posse resists the risk of trying to turn Python into C# or Java or Scala, and I appreciate that they have put much hard work into a couple of evolutions towards opt-in strong type safety.

As I learn more about the language I see more things that are truly more C/C++-like than Java-like, e.g. id(), sys.getsizeof(), operator overloading, etc., and I have already stopped freaking out over writing 'self' a LOT when working with classes/objects.  The fact that someone outside the class can add or delete members on my objects still seems like leaving the doors to the house unlocked, but, whatever.
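
There is at least a partial lock for those doors, as I understand it: __slots__. A minimal sketch:

class OpenHouse:
    pass

class LockedHouse:
    __slots__ = ("owner",)    # only this attribute may ever exist

    def __init__(self, owner):
        self.owner = owner

open_house = OpenHouse()
open_house.surprise = "anyone can bolt new members on"   # silently allowed

locked = LockedHouse("me")
try:
    locked.surprise = "nope"
except AttributeError as err:
    print(err)   # roughly: 'LockedHouse' object has no attribute 'surprise'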

I am actually not even using any of the static type features at this time, but knowing they are there and that people are working on them makes me feel secure that everything I am learning isn't wasted effort: you can probably work safely on large projects with multiple developers by applying the right tools (still evolving) and discipline (which hardly seems like discipline coming from years of C/C++/Java etc.).

For some reason, I keep thinking about how Bruce Eckel went from being a C++ guru, thru Java and then Python to wind up a Scala author.  All of C++, Java and Python have evolved a lot since he mostly (to my knowledge) moved on from them.

In particular, the quote about how dang hard it was to work with files, directories, etc. ("I wrote one of the most popular Java books, and every time I have to play with files and dirs I need to go look it up!" -- a paraphrase) predates NIO.2 and various other nice things that have happened in Java since then.  I've seen Java talks where they reduced like 36 lines of Java 5 code finding and reading a file to like 5 lines of Java 8....
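
Python's side of that story has gotten compact too; a quick sketch with pathlib (the directory and suffix here are just for illustration):

from pathlib import Path

# Find every .log file under the current directory and print its first line.
for path in Path(".").rglob("*.log"):
    with path.open() as f:
        print(path, "->", f.readline().rstrip())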

Back to Python adventures!
 
Tim Holloway
Saloon Keeper
Posts: 23438
For quite a long while, when I needed something quick-and-dirty in Unix-like environments, I'd use Perl. But Perl can get really gnarly. Of course, Windows users had Visual Basic, but VB wasn't available for Linux, and that's one reason Python arose: it's the Unix equivalent of VB. Rexx was nice, but not really found outside OS/2 and the Commodore Amiga, unless you had a mainframe.

Now I usually use Python - unless the job is really regex-heavy. Or, for web code, JavaScript. Or PHP, but more rarely these days. And on AVR devices, where I spend a lot of my time now, C++.

But for Industrial-grade design and programming, I want Java. Java is too heavy for off-the-cuff stuff, but the very principles and services that weigh it down for trivial apps are very important when you need something to run at scale in a complex and possibly hostile environment.

That's been the fault of the "big hit" world-famous applications. They often got hacked out in JavaScript, PHP, or Python (or formerly, Ruby on Rails), and as they got more successful, they showed the liabilities of poor performance, poor reliability, and poorer security. Their owners, however, had by then invested so much in the platform that it was easier to live with it and hope. Or to try to develop tools that would do externally what strongly-typed languages do internally.



 
Jesse Silverman
Ranch Foreman
Posts: 338
While I think that a lot of what did Perl in was not calling Perl 6 (now Raku) "Ultimate Infinity Language"... in 1999 my most Perl-loving friend said "Maybe you should just wait a year and do Perl 6 instead" because I was finding Programming Perl, 2nd Edition tough going... and in 2010 my fresh-out-of-engineering-school programmer started working with me and, once again, told me "Perl 6 is almost here and it will be great"... I am not sure it was really production-ready even in 2019...

One thing Perl did have working against it was a *perception* of being hard to read, partially deserved.
I remember Ken C.R.C. Arnold of Curses/Rogue fame (who has done some Java stuff too) saying he'd been doing Perl for years, and whenever he looked at someone else's Perl he was like "What is this??  Is this even Perl??"

I posted on LinkedIn after a HackerRank-in-Python binge...

Python seems to be married to the idea that programming should be fun and easy and look normal. Perl used to think it should be fun and easy and look weird.



It isn't original, as Guido has called Python "Perl without the weird syntax" in the past, but still... it was what I was feeling.

I had an Atari ST rather than an Amiga, so all of my Rexx use was on OS/2 and the mainframe... I remember when I got to my first job and saw Rexx everywhere.
My first question was "Isn't that from the Amiga??  Is this ARexx?"

You are probably right about Java still being awesome for huge projects.
I guess the main competitors are Scala (not sure if Kotlin is being used a lot outside Android), Go, and Rust now?
C++ is great for people who've spent huge amounts of time on it over many years, but I feel like even a couple of hundred hours of study could leave someone in a dangerous place there.  I don't think it's unfair to characterize it as having a million ways to blow yourself up.  Running Coverity/Klocwork/etc. on code you think is rock solid is often eye-opening....

C# gets maligned here a lot because it was neither truly open-sourced nor truly cross-platform in a meaningful way until quite recently.  Every time I switch back and forth between C# and Java I get annoyed at the false equivalency people draw between them, but I guess in the larger scheme of all possible languages they are somewhat similar, kind of in the same space.

The most inconvenient and annoying languages still seem like smooth sailing compared to what programming meant to me for the first 10, or probably 20, years.  But the Python I was writing today... felt like I was writing it almost as fast as I could type, and there was very little typing!
 
Tim Holloway
Saloon Keeper
Posts: 23438
You cannot truly appreciate Perl as a "write-only" computer language until you start working with complex collections and aggregates and the "&"s, "@"s and "$"s start flying. Alas, that's where I was at the end.

I think that it was Alan Kay who called Java "C++ without the Mace and Knives", and I think it's largely accurate. C++ has never been that big a problem to me, but then I was one of the first to promote it in the PC world. Apparently a lot of people have problems with its memory/object management, though, and that's in large part what Rust was created to address. We've learned a lot about OOP and OS low-level programming since C++ was created.

Kotlin's success is due at least in part to the fear that Oracle will make "Java" (Dalvik) unusable on the Android platform, so Kotlin allows you to design for Android without potential disruption from lawyers.

It's ironic that both Sun and Microsoft owned both OSes and programming languages, but Sun's language was designed from the get-go to be independent of its OS (Solaris), with "write once, run anywhere". Microsoft, on the other hand, designed J++ with the explicit intent of hijacking Java into being a Windows-only programming language, and .NET was designed with a heavy dose of OS-native packages that apps might be required (or at least seduced) to dip into and thus become non-portable overnight. And, of course, Microsoft is notorious for its lack of backwards compatibility, whereas Java has always been scrupulous about maintaining it, up to and including its deprecation mechanism, which lets you make emergency repairs to obsolete code without having to stop in the middle of a panic and do a total rewrite.

So C# isn't avoided because it's deficient in technical abilities, but because it's hostage to Microsoft.
 
Jesse Silverman
Ranch Foreman
Posts: 338
While it is clear that not everyone has the same level of faith that they have put greedy/evil/harmful practices firmly and permanently behind them, the level and extent to which Microsoft has embraced cross-platform (at least Windows/macOS/Linux) development and open source seem extremely significant to me.

It is slightly off-topic for this thread, but, probably pretty important:
from:
https://en.wikipedia.org/wiki/.NET_Core

In November 2020, Microsoft released .NET 5.0 which replaced .NET Framework. The "Core" branding was removed and version 4.0 was skipped to avoid conflation with .NET Framework. It provides native multi-platform support including Linux and macOS and addresses the patent concerns related to the .NET Framework.[19]

It's been a long road, and their history wouldn't inspire confidence in anyone who's been paying attention for a long time, perhaps, but my long-term concerns over the ridiculousness of a "Write-Once-Run-Anywhere-As-Long-As-It-Is-Microsoft-Windows" have been greatly attenuating.

On the business side, the reason it probably makes sense is that Azure brings in a whole lot of their income now, so they are no longer trying to force Windows (and various associated proprietary, closed-source non-independent technologies) down everyone's throats.

It is off-topic, and not really a resolved issue, but the percentage of the important work Microsoft does that is open-sourced and freely available on GitHub seems both quite large and ever-increasing: Visual Studio Code, .NET 5, the C++ STL, PowerShell, etc.  There still seems to be significant residual distrust among long-term members of the FLOSS community.

I agree it is somewhat ironic that Google is moving so hard towards Kotlin for the reasons you suggested (tho, they have an ever-increasing relationship with JetBrains in their toolchain, so, there's that).
 
Jesse Silverman
Ranch Foreman
Posts: 338
Tim:

I vaguely remember people referring to "Effective Java" and the hazards of constants.

I think the risk is that compile-time constants get inlined into the bytecode of any code that references them, so if you change the constant in the class that defines it and some referencing code doesn't get re-compiled, that code keeps the stale value?

I think they suggested the awkward work-around of always fetching it at runtime?

Ugh!

I thought that kind of stuff was left behind with C/C++ compiling against promises instead of the real code, but maybe that is one of a category of places you could get something like that to happen in Java.

Still nothing compared to my Ruby friend thinking it was somehow "cool" that you could change the value of 1 to be 123_456...
 