Oct 30 2019

How I became an AWS Certified Cloud Practitioner

Category: Funny, Technical | Iuliana @ 2:08

This will not be a technical post instructing you how to study for and pass the certification, because I myself passed the exam by accident. Because I scheduled the exam by accident. But let’s go back to the beginning.

A few years ago, 2014 or 2015, Rpx quit working for Microsoft and therefore lost access to the VM this blog was hosted on. So, in order to keep the blog, I bought a Reserved Instance from Amazon and installed everything there. Why an instance in the Amazon cloud and not a cheap specialized WordPress hosting service?

Because I wanted to get more comfortable with the Amazon cloud. And because the only way I knew how to install & configure Apache, MySQL and WordPress was … manually. And I liked doing it. I still like doing it, even if I’m probably not that good at it. But since moving my blog to the Amazon cloud, I’ve survived two hacking attempts, my own experiments that mucked up file permissions so badly that WordPress barely worked anymore, and random MySQL failures.

When I was looking for a new job, I was not looking for a cloud engineer job. I was looking for anything that would allow me to finally make more money out of my Spring expertise. But oh well, sometimes people just click and so far I’m convinced I made the right choice.

Thus I am now starting to shift from Java/Spring expert towards … full-stack, or better said Jack-of-all-trades, a title that was given to me at the beginning of my career and kinda limited my job selection at the time, because apparently it was more valuable to be an expert in a single domain than to juggle with everything. It’s quite ridiculous that after managing to finally stick to a niche for ten years, my initial Jack-of-all-trades skill might have gotten me paid better if I had just stuck with it. But oh well, it is what it is.

The company I currently work for is an Amazon partner, but AWS certifications expire, so after some people left the company and/or the certifications of those who stayed expired, the company found itself in danger of losing its partner status for not having enough AWS-certified people employed. And so the latest three people who were hired had to become certified. I am one of those people.

So I started preparing. And I panicked, because I realized I hadn’t studied for an exam in … 12 years. And the information you need to accumulate to pass the certification is basically a detailed manual on how to use Amazon services wisely. And they provide a lot of services, for … well… anything. And it is not logical, it cannot be structured or organized in some programmatic way, it is not about designing or implementing anything; it’s more similar to the theoretical exam for a driving licence. And I hate this kind of exam. My mind works very well with information that can be associated and connected to existing knowledge, even when it is not part of the foundation of my expertise, because the new information gets connected to and inferred from what I already know. But the AWS training material … it’s very hard to associate with anything. So… I read and I wrote and watched the video training samples, and still I had the impression that I was retaining … nothing.

After my much smarter, more logical and structured colleague passed the exam, I just logged into the AWS account and checked to see when I could schedule my exam too. Well, I’m not sure what I did, or maybe my Firefox trolled me, but aside from an exam date four days away, the next one was three weeks away. And being already panicked that I was not retaining information, I feared forgetting everything in three weeks. So I scheduled my exam for the 25th of October; at the time I had no other choice. And I did this on Monday the 21st of October. I spent the next three days reading, writing, listening to those video tutorials again and panicking. In a way, whatever the result, at least I would be able to take a break from reading Amazon propaganda, because that is 90% of the training material. :))

And luckily, I passed.

After that, I talked to my colleague and told him why I had scheduled the exam so rashly, and he showed me the calendar of available dates on his computer, and well … there were a lot more dates available than what I had seen.

So yeah, I scheduled myself by mistake, quite rashly, for the AWS Cloud Practitioner exam. I was definitely not completely prepared for it. But apparently it was enough. And now I can take a break from reading about how to use AWS services and actually solve some useful tasks.

Lesson learned: Some mistakes are worth making.

All is well with the world.

Stay safe, stay happy!


Sep 17 2019

1984 = 2019

Category: Miscellaneous | Iuliana @ 2:14

I told you a while ago that I had read 1984. I am still a little bit annoyed with the world that the book depicts. What I did not mention was that before it I went through Brave New World, Fahrenheit 451 and Animal Farm.

And I arrived at a conclusion: these books can be put in a sequence. Imagine this: a world like the one depicted in Animal Farm leads to a world like the one depicted in Fahrenheit 451, which in turn leads to a world like the one described in 1984, and then finally to a world like the one in Brave New World.

I had planned to write a longer article than this about these books and how they seem to connect to one another, but something happened tonight that pissed me off, because I think we are closer to the world of 1984 than we thought. My only consolation is that Brave New World is close, and if I survive until then they will pump me up with Soma big time to keep me docile. :D

Few of you know that until about three years ago this blog was hosted on http://rpx.kicks-ass.net. Even fewer of you know that I also used to write on http://seaqxx.kicks-ass.net. And that is because my then boyfriend owned the kicks-ass.net domain.

Tonight I was having a private conversation with a friend, and Facebook decided to stop me from sending a message with the URL of my old blog because …

So yeap. They are not even hiding the fact that they are reading your private messages anymore. But then again, as long as your messages are stored in their databases, they are not really private, are they?

Take it how you want, but the next person who tells me that what politicians and corporations do doesn’t really affect me gets a kick in the teeth.

Anyway, as you can imagine, my friend and I are now having a very dirty conversation to check how restrictive the bot is. Because… engineers.

Stay safe, stay happy and keep your stuff private. If you can.


Jan 06 2016

This is what I do

Category: Technical | Iuliana @ 22:25

When everybody was going on vacation, a few colleagues and I stayed behind in order to perform the migration of our very large project from CVS to Git. We used the wonderful cvs2git tool, although many reports on the internet say that its results are unpredictable. I mentioned the same thing during the preparatory meetings, but for the first time since I have worked at this company there were apparently people more optimistic than me, because on the 23rd of December the migration began. A little earlier than everybody expected, but oh well…
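For reference, the basic cvs2git workflow looks roughly like this (just a sketch; the paths and the module name are placeholders, not our actual layout):

# export the CVS history into two git fast-import streams
cvs2git --blobfile=/tmp/git-blob.dat --dumpfile=/tmp/git-dump.dat \
        --username=cvs2git /path/to/cvsrepo/module

# replay the streams into a fresh bare Git repository
git init --bare module.git
cd module.git
cat /tmp/git-blob.dat /tmp/git-dump.dat | git fast-import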


We had one big CVS repository, so the first step was to restructure our project and split it into smaller ones that could be easily migrated. The issue was that one project could not be split, and that was the one that caused a lot of trouble. As I write this post, that project is still being migrated, and it is migrated a little differently than the others: each branch of the CVS repo becomes a Git repo, and then all these repositories will be merged into one. While all my colleagues recommended this and that, a lot of shell and Git commands found on Stack Overflow, I had the genius spark to merge these repositories in an instant using multiple remotes (see the sketch below). I’ll write more about this in a later post.
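To give an idea before that later post, the multiple-remotes trick boils down to something like this (only a sketch; the repository names and paths are made up):

# an empty repository that will hold everything
git init merged-project
cd merged-project

# one remote per per-branch repository produced by the migration
git remote add branch-a /path/to/branch-a.git
git remote add branch-b /path/to/branch-b.git
git fetch --all

# start from one branch, then merge the others into it
git checkout -b master branch-a/master
git merge branch-b/master    # on Git 2.9+ you would also need --allow-unrelated-histories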

Before the vacation started, I trained my colleagues in using Git. If you ask me, the training was quite a fiasco, because I had only 2 hours per group to explain what Git is, what the differences between CVS and Git are, how Git works internally, what GitBlit is, how to work with Git from Eclipse and its stupid EGit plugin, and how to work around its mishaps. As you can imagine, 2 hours were not enough, but it is what it is; I had to work with the resources I was allocated. Knowing exactly how the training went, I took advantage of the free days I had, slept a lot and prepared myself mentally for 6 months of answering repetitive, sometimes ridiculous Git questions. I mean, I am expecting my colleagues to have the weirdest questions and to do the weirdest things with Git.

And now, this is the first week. My responsibilities do not cover only Git consulting: my project manager is on vacation, so I had to take over his responsibilities, deliver a fix, prepare the hotfix package for testing and delivery, and also help people in the company update their release/hotfix scripts to use Git. Fortunately, the hotfix was ready and tested, and it will be delivered at the end of the week.

But today a serious problem emerged. People were unable to work with the remote repositories. They got a lot of timeouts, and nobody knew the cause. The logs did not say anything related to it. So we started analyzing everything that could affect this.

We started with GitBlit: everything looked fine in the gitblit.properties file, and all the SSH properties were set to appropriate values.

Most of us were using the SSH protocol to communicate with the remotes, so we needed to check how many SSH connections the server could handle in parallel. SSH works over TCP, so the number of TCP connections was just as relevant.

# cat /proc/sys/net/core/somaxconn
128

And it was a damn small one. It was increased to 1024. And it seemed to work for a while, but as soon as everybody started cloning, pulling and fetching, the problem reappeared. So this was clearly not it.
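For the record, the increase amounted to something like this (the -w call takes effect immediately; persisting it in sysctl.conf is just good practice):

# raise the TCP listen backlog right away
sysctl -w net.core.somaxconn=1024

# keep the new value after a reboot
echo "net.core.somaxconn = 1024" >> /etc/sysctl.conf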

I then started to look at the SSHD server installed on the machine. There were two parameters that interested me: MaxSessions (specifies the maximum number of open sessions permitted per network connection) and MaxStartups (specifies the maximum number of concurrent unauthenticated connections to the SSH daemon; additional connections will be dropped until authentication succeeds or the LoginGraceTime expires for a connection). Both were commented out in our /etc/ssh/sshd_config file, so I guess the default value of 10 was used for both of them. So both were set to 1024. (Yes, I like this number.)
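In sshd_config terms, the change looked like this (the restart command depends on the distribution):

# /etc/ssh/sshd_config -- uncommented and raised to 1024
MaxSessions 1024
MaxStartups 1024

# apply the change (or: systemctl restart sshd)
service sshd restart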

I restarted the sshd service and, again, for a while everything looked fine. Then the timeouts started again. I started to think that maybe GitBlit did not close the connections successfully, and that is why the 1024 quota was reached and timeouts happened. So I started looking at GitBlit again. After some research into each of its properties I found this one:

# The default size of the background execution thread pool in which miscellaneous tasks are handled.
# Default is 1.
execution.defaultThreadPoolSize = 1

And as you probably suspect by now… it was modified to 1024. I restarted the Tomcat instance hosting the GitBlit installation and… voilà. Remote operations are now working for my colleagues. Apparently remote operations over the SSH protocol count as miscellaneous tasks.
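In other words, the fix boiled down to this single line in gitblit.properties, followed by a restart of the Tomcat instance hosting GitBlit:

# gitblit.properties -- the value we ended up with
execution.defaultThreadPoolSize = 1024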

I was doing all these things while consulting people about Git, and my close colleagues were amazed at how serene I was and how well I was handling it all. Actually, I think I was a little amazed too, but then I realized that there is nothing to be amazed about. I was prepared for this; I was expecting a hell of a lot of confusion and people running around like Dexter (the cartoon character) when his hair was on fire. I was prepared because I am good at this job and because this is what I do.



Oct 09 2015

Just dev things

Category: Technical | Iuliana @ 16:46

After receiving an email from MyEclipse that begged me to renew my license, I got a little creative. Click on each image in order to enlarge it.
[Images: small_eclipse, small-love-intellij]



Sep 20 2015

First time in Făgărași mountains

Category: Miscellaneous | Iuliana @ 13:41

This weekend I interrupted my normal routine to go hiking. I had not done this in 3 years and I missed it a lot, so when given the opportunity I took it. I really like getting lost in places that civilization hasn’t managed to touch that much. I love the silence, the tranquility of secluded places in the mountains that not many people get to reach. In the image below you can see the mountain route in blue. I went there with two friends who knew the mountain better than I did.
[Image: map with the mountain route marked in blue]



Feb 09 2014

What java and vlc do to your computer when run together

Category: Technical | Iuliana @ 11:25

[Image: what-java-does]

Results from cat /proc/cpuinfo:
4 x

vendor_id	: GenuineIntel
cpu family	: 6
model		: 37
model name	: Intel(R) Core(TM) i5 CPU       M 480  @ 2.67GHz
stepping	: 5
microcode	: 0x4
cpu MHz		: 2667.000
cache size	: 3072 KB
physical id	: 0
siblings	: 4
core id		: 2
cpu cores	: 2
apicid		: 5
initial apicid	: 5
fpu		: yes
fpu_exception	: yes
cpuid level	: 11
wp		: yes
flags		: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr
 sse sse2 ss ht tm pbe syscall nx rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc
 aperfmperf pni dtes64 monitor ds_cpl vmx est tm2 ssse3 cx16 xtpr pdcm pcid sse4_1 sse4_2 popcnt lahf_lm ida arat
 dtherm tpr_shadow vnmi flexpriority ept vpid
bogomips	: 5321.30
clflush size	: 64
cache_alignment	: 64
address sizes	: 36 bits physical, 48 bits virtual

Results from cat /proc/meminfo:

MemTotal:        5973056 kB
MemFree:          287364 kB
Buffers:          283604 kB
Cached:          1446940 kB
SwapCached:        43652 kB
Active:          3517056 kB
Inactive:        1668528 kB
Active(anon):    2678664 kB
Inactive(anon):   841940 kB
Active(file):     838392 kB
Inactive(file):   826588 kB
Unevictable:          28 kB
Mlocked:              28 kB
SwapTotal:       8191996 kB
SwapFree:        7899416 kB
Dirty:               132 kB
Writeback:             0 kB
AnonPages:       3424548 kB
Mapped:           194148 kB
Shmem:             65564 kB
Slab:             247620 kB
SReclaimable:     210876 kB
SUnreclaim:        36744 kB
KernelStack:        4968 kB
PageTables:        52676 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:    11178524 kB
Committed_AS:    7064468 kB
VmallocTotal:   34359738367 kB
VmallocUsed:      365944 kB
VmallocChunk:   34359318848 kB
HardwareCorrupted:     0 kB
AnonHugePages:   1083392 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:      566520 kB
DirectMap2M:     5584896 kB

Yeah … I need a new workstation, because the current one is definitely deprecated.


Apr 13 2013

Linux: connect to VPN (complete)

Category: Technical | Iuliana @ 22:53

Some time ago at work, I was assigned to a new project. To be able to access client-specific resources I needed to be able to connect to a VPN. I was given a domain, a username, a password and a gateway. Everything was simple in Windows and all resources were accessible. Among these resources were some servers (testing, acceptance, stuff like that) which had the application installed and were accessed through the browser via HTTP (example: http://server1:8080/application). But when my request to work on Linux was approved and I received a fresh Linux workstation to configure as I pleased, I stumbled across a few problems, because every tutorial on the internet that explains how to set up a VPN connection in Linux is incomplete. So, what did I do?

The first step was to get all the information from Windows that I could. So I right-clicked on the VPN connection and took screenshots of all the properties shown. Then I logged on to my Linux machine (Fedora 18 at work, Ubuntu 12.10 at home; I mention this because the steps are identical) and proceeded to create my VPN connection according to the steps here, always taking a look at the screenshots I had taken in Windows. Just to make sure, I also asked the colleague who gave me the VPN details in the first place what type of VPN it was, and he said: “it’s a standard Windows VPN, PPTP, port 1723”.

So the steps I took were:

    1. Right-click on the Network Connections icon, select VPN Connections, then click on Configure VPN.
    2. In the dialogue window that appeared, I clicked on the Add button.
    3. A new dialogue window appeared asking me to select the type of the VPN connection. I selected PPTP and clicked on the Create… button.
    4. A new dialogue window appeared with two tabs: VPN and IPv4 Settings.
    5. In the VPN tab there was an Advanced button. When clicked, a new dialogue window appeared with advanced options to select. I checked everything that I found checked in the Windows screenshots and left unchecked everything that was unchecked in them. In my case I had to deselect all authentication methods except MSCHAPv2 and check everything else in the dialogue box except “Send PPP echo packets”.
    6. And now, if you save everything, the connection will succeed. But if you need access to applications installed on servers reached via their host names, you will need something called a DNS suffix, which can be added in the “Additional search domains” textbox in the IPv4 Settings tab.
    7. If you don’t know what value to put there, and your colleagues didn’t tell you (mine did not), you can do the following: log into Windows and connect to the VPN.
    8. Open a Command Prompt terminal and execute the following command: nslookup hostname. You should get an output similar to this:
            Server:  hostname.somedomain
            Address:  xxx.xxx.xxx.xxx
    9. Now copy somedomain into the “Additional search domains” textbox in the IPv4 Settings tab and save everything.
    10. If you need the same kind of access I needed, also take a look at Firefox and the proxy it uses. Even with a successful VPN connection and a correct DNS suffix, I could not connect to http://server1:8080/application because Firefox was set by default to “Use system proxy settings”. When I set it to “Auto-detect proxy settings for this network”, it worked like a charm. (There is also a rough command-line equivalent of this whole setup sketched below.)
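If you prefer the terminal, newer NetworkManager releases can create roughly the same connection with nmcli. Take this only as a sketch under assumptions: it needs the NetworkManager PPTP plugin installed, the connection name, gateway, domain and user below are placeholders, and the vpn.data key names are my best guess at the ones matching the GUI checkboxes described above.

# create the PPTP connection (work-vpn, vpn.example.com, CORP and jdoe are placeholders)
nmcli connection add type vpn vpn-type pptp con-name work-vpn \
    vpn.data "gateway=vpn.example.com, user=jdoe, domain=CORP, refuse-eap=yes, refuse-pap=yes, refuse-chap=yes, refuse-mschap=yes, require-mppe=yes"

# the "Additional search domains" textbox maps to the ipv4.dns-search property
nmcli connection modify work-vpn ipv4.dns-search somedomain

# bring the connection up; nmcli will prompt for the password
nmcli --ask connection up work-vpn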

So, these are the steps that I took. I asked a Linux guru friend (Rpx) for help in debugging the VPN settings, because I am not that good at networking, and I thought the additional information I discovered with his help might be useful to somebody else too; that’s why I wrote this post. I will appreciate any kind of feedback.
