# Assignment Result - Groups

My second assignment on groups was marked and returned. As predicted, there's a lot to take away from this.

## Basic Groups

At this level, it seems it's not sufficient to use notation like , and to refer to subgroups. In my head I know what the binary operation of each of these subgroups is, but for the benefit of the reader the (better) convention is to state the binary operation explicitly: , , and specifically.

Definition check: a cyclic group is just a group generated by a single element. It doesn't have to "cycle" back to the beginning of the ordered set; e.g. the integers under addition form an infinite cyclic group, generated by 1.
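To make the definition concrete, here's a tiny sketch of my own in code (the choice of $\mathbb{Z}_n$ and the function names are mine, not the module's): an element generates the whole of $\mathbb{Z}_n$ exactly when repeatedly adding it reaches every element.

```python
# Sketch: find the generators of the cyclic group Z_n
# (integers mod n under addition). g generates Z_n exactly
# when <g> has n elements, equivalently gcd(g, n) = 1.
def generated_subgroup(g, n):
    """Return the subgroup of (Z_n, +) generated by g."""
    seen, x = set(), 0
    while x not in seen:
        seen.add(x)
        x = (x + g) % n
    return seen

def generators(n):
    """All elements g of Z_n with <g> = Z_n."""
    return [g for g in range(n) if len(generated_subgroup(g, n)) == n]
```

For instance, `generators(6)` picks out just 1 and 5, the residues coprime to 6.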

Mistakes in basic proofs. This one is a classic. With , I had to prove was onto. My answer:

I was all smug about this because it looks pretty. But it just restates the definition of the function being onto, and I'd neglected to exhibit the actual that always exists. Namely .
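The function in question has been lost from this page, so as a stand-in, here's the shape the corrected argument takes for a made-up example, $f:\mathbb{R}\to\mathbb{R}$, $f(x) = 2x + 1$:

```latex
% Claim: f(x) = 2x + 1 is onto.
% Don't just restate the definition: exhibit the preimage.
\text{Let } y \in \mathbb{R} \text{ and take } x = \tfrac{y-1}{2}.
\text{ Then } f(x) = 2 \cdot \tfrac{y-1}{2} + 1 = y,
\text{ so every } y \text{ is attained.}
```

The point being that the proof names a concrete preimage rather than merely asserting one exists.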

## Group Classification

Group decompositions can't be written as . They need to be written as . 3 and 15 are not coprime, so they cannot be combined.
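The specific groups above have been stripped from this page, but the criterion at work is that $\mathbb{Z}_m \times \mathbb{Z}_n$ is cyclic (and hence isomorphic to $\mathbb{Z}_{mn}$) exactly when $\gcd(m, n) = 1$. A minimal sketch of that check (the function name is mine, and 9 and 5 are just one illustrative coprime pair):

```python
# Sketch: Z_m x Z_n is cyclic, and hence isomorphic to Z_mn,
# exactly when gcd(m, n) = 1, because the largest order of any
# element of the product is lcm(m, n).
from math import gcd

def product_is_cyclic(m, n):
    """True when Z_m x Z_n contains an element of order m*n."""
    lcm = m * n // gcd(m, n)
    return lcm == m * n

# 3 and 15 share a factor of 3, so Z_3 x Z_15 is not cyclic;
# 9 and 5 are coprime, so Z_9 x Z_5 is.
```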

## Finite Groups

As part of a group presentation there are identities like . However, for the most part in your answer, especially when you're talking directly about elements of your group, they must always be written in the form (with first).

In being asked to deduce an isomorphism, I heavy-handedly defined both groups (as they were small) and then also stated an explicit function that would define the isomorphism. Apparently it's enough to state that both groups have order 2: any two groups of order 2 are isomorphic.

I need to become a lot more familiar with the concept of the centre of a group. To show that a subgroup is the centre of its parent group, I needed to show that the subgroup's elements commute with every element of the parent group. I was just trying to wing it, and although I essentially got the answer right, I showed I had a complete misunderstanding of centres.
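As a small exercise of my own to cement the actual definition, $Z(G) = \{g \in G : gx = xg \text{ for all } x \in G\}$, here it is computed directly for $G = S_3$ (permutations as tuples; all names are mine):

```python
# Sketch: compute the centre of a group straight from the
# definition Z(G) = {g : g*x == x*g for all x in G}.
# G here is S_3, with permutations stored as tuples over (0, 1, 2).
from itertools import permutations

def compose(p, q):
    """Permutation composition: apply q first, then p."""
    return tuple(p[q[i]] for i in range(len(q)))

def centre(group):
    return [g for g in group if all(compose(g, x) == compose(x, g) for x in group)]

s3 = list(permutations(range(3)))
```

Since $S_3$ is non-abelian, `centre(s3)` comes back as just the identity permutation.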

# Chapter 4 of 24 - Fermat's and Wilson's Theorems

Okay. Another chapter down. Very much "congruence continued".

Sections I covered were Fermat's Little Theorem, representation of fractions by decimals, Wilson's Theorem, and polynomial congruences. On top of that, I've managed to complete my first assignment based on these first four chapters. Reviewing that feedback will be very interesting.
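The two headline theorems can be sanity-checked numerically. A quick sketch of my own (not from the module texts): Fermat's Little Theorem says $a^{p-1} \equiv 1 \pmod p$ for prime $p$ with $p \nmid a$, and Wilson's Theorem says $(p-1)! \equiv -1 \pmod p$ exactly when $p$ is prime.

```python
# Quick numerical checks of Fermat's Little Theorem and
# Wilson's Theorem for small inputs.
from math import factorial

def fermat_holds(a, p):
    """a^(p-1) = 1 (mod p); true for prime p not dividing a."""
    return pow(a, p - 1, p) == 1

def wilson_holds(p):
    """(p-1)! = -1 (mod p); true exactly when p is prime."""
    return factorial(p - 1) % p == p - 1
```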

One thing I have noticed is that the materials in the book are far more challenging than the questions in the assignment. I wonder if I'm being lured into a false sense of security here. I suppose I'll only know when I review the past exam papers for this module...

Can't quite believe I've reached this point as the next chapter is "Examples of groups". Hello group theory my old friend!

# Chapter 3 of 24 - Congruence

I've worked with congruences before, but this chapter was a lot more fun than I thought it would be. Introduction to properties, classic divisibility tests and simultaneous linear congruences.
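Simultaneous linear congruences of the size found in the book can be solved by brute force over one full modulus cycle. A sketch (the example system in the comment is my own, not one from the text):

```python
# Sketch: solve a system x = a_k (mod m_k) by searching one full
# cycle of the combined modulus. Fine for small book-sized moduli.
from math import prod

def solve_congruences(pairs):
    """Least x >= 0 with x = a (mod m) for every (a, m) in pairs, else None."""
    modulus = prod(m for _, m in pairs)
    for x in range(modulus):
        if all(x % m == a % m for a, m in pairs):
            return x
    return None  # no solution, e.g. incompatible non-coprime moduli

# e.g. x = 2 (mod 3) and x = 3 (mod 5) gives x = 8 (mod 15)
```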

Questions in this chapter seemed a lot more achievable. Though looking at the assignment, the related question seemed completely out of my reach. It was that feeling of fear you normally get in an exam when you see a question you don't understand and your brain shuts down. I've got some work to do in this area, as I feel I've been experiencing that feeling fairly consistently in the last two exams I've had (stats and complex analysis), both of which were covid lockdown at-home exams.

What was good, though, was that this chapter had some "highest common factor" questions thrown in, which was really useful. Good for building confidence.

It seems there's more congruence work in the next section, "Fermat's and Wilson's Theorems".

# Chapter 2 of 24 - Prime Numbers

Right. Another very tough chapter complete.

Well, I say complete. I did have to leave out a number of the exercises again. Of the answers to the exercises that I understand, most are inspired. There's simply no way I could ever make the logical leaps demonstrated in those answers.

I was so disturbed by the difficulty of the exercises in Chapter 1 that I decided to consult my tutor about it. He replied explaining that a lot of the exercises are meant to teach you through looking at the answers, so you're kind of meant to get stuck. This made me feel a lot better, but now it's just down to the time I can spend reviewing and learning from each answer.

But yes. Prime numbers. Fun chapter! Covered an intro to The Primes, the prime decomposition of integers, the infinitude of primes, famous problems concerning primes, and Fibonacci numbers. Fibonacci numbers are far more interesting than I ever realised, it turns out.
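The prime decomposition of an integer can be sketched in a few lines of code (my own illustration; the existence and uniqueness of the factorisation is the Fundamental Theorem of Arithmetic covered in the chapter):

```python
# Sketch: prime decomposition of n >= 2 by trial division.
def prime_decomposition(n):
    """Return the prime factors of n in ascending order, with multiplicity."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors
```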

Okay. Time to move on to Congruence...

# Chapter 1 of 24 - Foundations

Well 2021 has gone quickly hasn't it?

The workload in complex analysis became so large that I couldn't contribute to this blog for the rest of the academic year. This is frustrating as I felt there were some really important things to learn from my marked assignments. If I ever find time to write them up, I will do.

Though needless to say, it's suddenly October. My new (and final) module begins! M303 Further Pure Mathematics. This will be a big one. It's a final-stage double-credit module and contains an absolutely enormous amount of material.

Last week I received an email from my tutor that could be summarised in two words: "GET AHEAD". Of course that's when the fear struck, so I decided to be tactical with the first chapter. There were two sets of exercises I didn't attempt, because I realised I knew enough about the material to answer the assignment questions. However, of the exercises I did try, I could complete practically none. At this early stage, I'm perhaps feeling it might've been a mistake to choose this module. Having said that, I have been able to answer the first two questions of the assignment.

The first chapter was about number theory: Proof by Mathematical Induction, highest common factors, lowest common multiples, the Euclidean Algorithm, and Diophantine equations. Not only all that, but all the theorems, definitions, propositions and lemmas that go along with them.
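The Euclidean Algorithm run backwards (the extended form) ties several of these topics together: it produces integers $x, y$ with $ax + by = \gcd(a, b)$, which is exactly what's needed to solve the linear Diophantine equations in the chapter. A sketch of my own:

```python
# Sketch: extended Euclidean Algorithm.
# Returns (g, x, y) with a*x + b*y = g = gcd(a, b).
def extended_gcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    # unwind one division step: gcd(a, b) = gcd(b, a mod b)
    return g, y, x - (a // b) * y
```

A Diophantine equation $ax + by = c$ then has integer solutions exactly when $\gcd(a, b)$ divides $c$, and scaling the returned $(x, y)$ gives one of them.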

I feel this was so difficult because you need to use all the theorems and definitions like a palette of different paints: just as the artist paints a picture, the mathematician uses the theorems and definitions to build a proof. The problem is, it felt like I was being taught what the primary colours were for the first time. Still, we've moved on, and I've marked the exercises I need to return to. I'm hoping that down the line I'll be able to return to them and they'll make a bit more sense to me.

Deep breaths.

Next chapter? Prime numbers...

# Complex Analysis Assignment 1

My first complex analysis assignment has been marked and returned. I don't think I've ever felt the urge quite so much to learn from my mistakes.

Consequently there has been quite a lot of post-assignment learning... :/

This assignment featured a very brief introduction to complex numbers as a refresher, then broadly covered complex functions, the concept of continuity and complex differentiation.

So in no particular order, below are some notes on mistakes I made and how I could've avoided them! There's a lot to reflect on here...

Read questions carefully. One of the first very simple questions read "express in polar form and determine all fourth roots". I did the second bit, but not the first.

I feel this is a bit "Complex Numbers 101", but the square root sign is defined as the principal square root (of a complex number), i.e. there's no need to calculate the second root.

If you're using the triangle inequality, state it specifically.

Again, this is fairly "Complex Numbers 101", but the polar form of a complex number isn't just any cosine function as the real part and any sine function as the imaginary part: the arguments of both functions must be identical to qualify as "polar form", i.e. you should be able to rewrite the complex number in exponential form.
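In symbols, with the same $\theta$ appearing in both trig functions:

```latex
z = r(\cos\theta + i\sin\theta) = re^{i\theta},
\qquad r = |z|, \quad \theta = \arg z .
% Something like 2(\cos\tfrac{\pi}{3} + i\sin\tfrac{\pi}{6}) is NOT
% polar form: the two arguments differ, so it cannot be collapsed
% into the exponential form re^{i\theta}.
```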

Top tip: Be mindful about using identities. In complex analysis there are loads of them and they help a great deal.

When working out the inverse of a complex function, it's important to use your common sense. Part of one inverse I'd calculated had a square root in it. Just by looking at that, you know it could never produce a unique answer (it isn't a one-to-one function).

For another, I had to find the inverse of and the domain of that inverse. I got this spectacularly wrong. I'd written: given , hence .

Trick here was to exponentiate each side, leading to . But the domain of the inverse isn't affected by the "3" above, the image set of the original function is still .

Some complex functions are very very different to their real equivalents. Case in point: , but . Which leads to the next note:

If is the divisor in a complex quotient, you need to show that it's only 0 for values outside of the given range of the equation (eg ).
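The real-versus-complex examples above have been lost from this page, but a classic instance of the mismatch (my choice, not necessarily the assignment's) is the sine function, bounded on $\mathbb{R}$ yet unbounded on $\mathbb{C}$:

```latex
|\sin x| \le 1 \ \text{ for all } x \in \mathbb{R},
\quad\text{but}\quad
\sin(iy) = i\sinh y,
\ \text{ so } |\sin(iy)| = |\sinh y| \to \infty \text{ as } y \to \infty .
```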

For one question, I had to prove that was continuous. I thought this was easy.

is a basic continuous function on . So if you let , then is continuous, right?

Not quite. I had entirely forgotten to state that the given set is a subset of the set I gave: .

The answer can appear obvious sometimes, but you have to keep your answer rigorous, otherwise you risk losing half marks or whole marks here and there.

For one question I had to prove whether a set was a region or not. For reference, a region is a non-empty, connected, open subset of . In the usual manner, if you can prove that any of those three properties don't hold then you've managed to prove that your set isn't a region. Easy.

I realised I could prove a set was closed, and hence not a region. Turns out this was incorrect. A set being "closed" and a set being "not open" have two completely different definitions. I was meant to show it was "not open", as opposed to showing it was "closed".

In other words, mathematically:

- Closed is not the same as not-open.
- Closed is not the opposite of open.
- Not-open is the opposite of open.
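A quick illustration of why the distinction matters (a standard example rather than the one from the assignment):

```latex
[0, 1) \subset \mathbb{C} \ \text{ is not open (no open disc about } 0
\text{ lies inside it) and also not closed (it omits its limit point } 1\text{)}.
\ \text{Meanwhile } \mathbb{C} \text{ and } \emptyset
\ \text{ are each both open and closed.}
```

So "not open" neither implies nor is implied by "closed", and a set can be neither or both.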

Again, here I needed to provide a proof based on the properties of various objects. Given a set that was compact (closed and bounded), I needed to prove that a function was bounded on that set.

The Boundedness Theorem states that if a function is continuous on a compact set, then that function is bounded on that set.

The function was:

I proved that the given function was continuous on its domain, but I'd failed to prove it was continuous on the set. Here, I needed to show where the function was undefined, THEN show that those points at which it was undefined all lay outside of the set. So there was quite a lot of work missing from this answer.
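The function itself is missing from this page, so as an assumed stand-in take something of the shape $f(z) = 1/(z^2 + 1)$ on the disc $\{z : |z| \le \tfrac12\}$; the missing work then looks like:

```latex
f(z) = \frac{1}{z^2 + 1} \ \text{ is undefined only where } z^2 + 1 = 0,
\ \text{i.e. at } z = \pm i.
\ \text{Both points satisfy } |\pm i| = 1 > \tfrac12,
\ \text{so } f \text{ is continuous on the compact set } \{z : |z| \le \tfrac12\},
\ \text{and the Boundedness Theorem gives that } f \text{ is bounded there.}
```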

My simultaneous requirement for the Cauchy-Riemann theorem, AND the Cauchy-Riemann Converse theorem within a proof ended up not flowing very well logically. Once again, I'd jumped ahead with my logic. As soon as I had seen something obvious, I felt the urge to state it immediately.

The Cauchy-Riemann theorem is used to prove that a function is not differentiable at certain points. The Converse theorem then proves that a function IS differentiable at certain points. After using the Cauchy-Riemann theorem, it was extremely obvious where the function was differentiable, so I stated it. Then, as a matter of course, I plodded through the Converse theorem to prove it. Complete lack of discipline! 🙂

# Complex Functions: Domains, Image Sets and Inverses

I can imagine having to refer to these notes regularly, so I'm putting them here!

# Image Sets

1. State the domain of .
2. Rearrange so that is a function of (to discover the condition under which remains valid).

e.g., for :

# Domain of Combined Functions

The domain of a combined function is the intersection (∩) of the domains of all the component functions, together with the domain of the combined rule itself. e.g:
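The worked example here has been lost from this page; as a stand-in, the general rules (with $A = \operatorname{dom} f$ and $B = \operatorname{dom} g$) are:

```latex
\operatorname{dom}(f + g) = \operatorname{dom}(fg) = A \cap B,
\qquad
\operatorname{dom}\!\left(\frac{f}{g}\right)
  = (A \cap B) \setminus \{ z : g(z) = 0 \}.
```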

# Domain of Composite Functions

For and with domains and respectively, the domain of is:

e.g., for:
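The example here has also been lost; the rule, plus a made-up illustration of my own in the comments:

```latex
\operatorname{dom}(g \circ f) = \{ z \in A : f(z) \in B \}.
% e.g. f(z) = z + 1 on A = \mathbb{C}, and g(w) = 1/w on
% B = \mathbb{C} \setminus \{0\}, gives
% \operatorname{dom}(g \circ f) = \{ z : z + 1 \ne 0 \}
%                               = \mathbb{C} \setminus \{-1\}.
```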

# Inverses

1. Determine image set of .
2. Invert to find a unique in the domain of .

For

(all same as above for finding an image set)

gives a unique solution in , hence has a unique inverse rule:
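The function is missing from the notes above, so here's the two-step recipe run on a made-up example of my own:

```latex
f(z) = z^2, \quad \operatorname{dom} f = \{ z : \operatorname{Re} z > 0 \}.
\ \text{Step 1: the image set is } \mathbb{C} \setminus (-\infty, 0].
\ \text{Step 2: for each such } w, \ z = w^{1/2}
\ \text{(the principal square root) is the unique solution with }
\operatorname{Re} z > 0, \ \text{so } f^{-1}(w) = w^{1/2}.
```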

# Feedback - 01

I received the marks back for my first monster assignment! Did quite well, as it turns out! But this blog isn't about spouting about my success; it's about the learning process! So here are some of the things I screwed up...

First off, my algebra is clearly rusty as fuck. In one instance I put a minus sign in the wrong place AND mysteriously lost a factor of 2 in the course of my working. In future I really need to re-read my working carefully (three or four times over, it seems), both the hand-written version and the full typed-up LaTeX...

Something else I lost marks on was the apparently simple task of graph sketching, where I either hadn't considered asymptotes or hadn't considered the limits of the domain. Overall I clearly need to be a lot more mindful of whether I'm dealing with or . I see those symbols so often that I frequently gloss over them without properly considering their usage. Again, pretty basic stuff.

With complex numbers I apparently need to be more explicit with my declaration of forms. My polar form was implicit in the answer, but there wasn't anywhere I actually stated it. Silly boy.

I fell down on a proof of symmetry for an equivalence relation. I just wasn't mindful whilst answering this. It is assumed that . This can be rearranged in terms of as . So substituting y in the symmetrical results in: . Of course, at this point, proving that what's inside the brackets is an integer is pretty difficult. But that's where I left it. A bit more play would've shown that I could easily have rearranged the first equation in terms of instead, which would've resulted in , which is rather obviously an integer given the initial variables. More exploration required in future...
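The relation itself has been stripped from this page; assuming something of the shape $x \sim y \iff x - y \in \mathbb{Z}$, the tidy symmetry proof is one line:

```latex
\text{Suppose } x \sim y, \ \text{i.e. } x - y \in \mathbb{Z}.
\ \text{Then } y - x = -(x - y) \in \mathbb{Z}, \ \text{so } y \sim x.
```

The lesson being: pick the rearrangement that makes the conclusion obvious before committing to the substitution.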

Lastly, in my last post I mentioned the distinct lack of symbolic existential or universal quantifiers in all this new material. After Velleman I was so used to seeing them and working with them appropriately, but because they're no longer around, I got totally burnt by assuming I had to prove "there exists" instead of "for all" for one question. I suppose I'll get around this by making sure my notes explicitly state whichever quantifier we're actually talking about. Damned English language... Symbols are much more concise! 🙂

# Large Intro

Finally submitted my first assignment. It was monstrous. Just over 23 pages of mathematics and sketches of graphs. All of it typed up in LaTeX. Skipping ahead to look at the rest of the assignments, it looks as if this first assignment may very well be the biggest of the whole lot. This is a very good thing as I really don't think I could churn out that much work of a high quality every month.

Glad to say that I was familiar with most of this introduction section. The only really new topic was equivalence relations, which caused some problems initially.

Overall though, what I've found difficult is the apparent lack of logical notation. After reading "How To Prove It" I've become half-decent at making sense of and rearranging logical notation to solve a problem. The difficulty comes in looking at the plain-English description of something in the texts and then having to translate it into logical notation to allow my fussy brain to think about them logically.

Perfect example of this is the definition of a function being "onto". In the text, the definition reads:

"A function is onto if ".

Which is fine, but the Wikipedia definition reads:

" such that "

Which gives me a much better idea about how to go about proving whether a function is onto. Why leave out the quantifiers? The Wikipedia definition tells me so much more. I suppose translating English into logical notation is just something I'll have to get good at!
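For the record, the fully quantified version of "onto" that the Wikipedia entry is getting at reads:

```latex
f : A \to B \ \text{ is onto}
\iff
\forall y \in B \ \exists x \in A \ \big( f(x) = y \big).
```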

Though even after this long intro section, I really feel I need more practice with proofs... I guess this may have to wait until revision time... Next up is the first section on group theory, with an assignment due on November 24th. Onward.

# PENS DOWN!

So that's it! The Summer has ended!

How far has my extra study got me? Well, I've managed to get through around 120 pages of "How To Prove It" by Velleman, and have generated just over 70 double-sided A4 pages' worth of exercises from the book. Not bad for an extra-curricular topic!

This book has helped me loads. It's succeeded in taking away a lot of the mystery involved in reading and writing proofs.

Every topic in the book up until now has flowed well, and allowed me to think about solutions to the various problems fairly naturally. What I mean by that is, I never became absolutely stuck and unable to answer a question.

Having said that, the sub-topic I'm finishing on is proofs involving quantifiers. This is the one area in which I'll admit I've been struggling. At this point in the book, I've learned so much about the number of ways in which to decon/reconstruct a problem that any possible method by which to prove a theorem has actually become less obvious.

Here's an example of how convoluted the scratch work of a basic proof has become: question 14 from p.122:

Suppose $\{A_i\}_{i \in I}$ is an indexed family of sets. Prove that $\bigcup_{i \in I} \mathcal{P}(A_i) \subseteq \mathcal{P}\big(\bigcup_{i \in I} A_i\big)$.

It's a short question, but this immediately looks like a nightmare to a beginner like myself. We've got a mix of indexed sets, a union over them, and power sets.

First off I need to properly understand the damn thing. Seems sensible to draw up an example using the theorem...

Let's say is . So we've got .

Now let's say that and .

Looking at the LHS of the thing I need to prove, it's actually pretty easy to break down:

Which means that the union of all the elements of the power sets of is:

A little half-way recap: the theorem says that in my example, should be equal to, or be a subset of (the RHS).

Let's see if that's true shall we?

Within the parenthesis of the RHS we've got . So this is the union of all elements of all indexed sets. In this example:

Only thing missing now is the power set of this:

And there we go. Now I understand exactly what the theorem means. , in this specific example, turns out to be :

which is obviously true. Theorem understood. Achievement unlocked. Tick.
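The concrete sets chosen for the example above have been lost from this page, so here's the same sanity check done in code, with my own choice of a two-set family:

```python
# Numerically check the exercise: the union of the power sets is a
# subset of the power set of the union (and, here, a proper subset).
from itertools import combinations

def power_set(s):
    """All subsets of s, as frozensets."""
    s = list(s)
    return {frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)}

family = [{1, 2}, {2, 3}]                           # my own example A_1, A_2
lhs = set().union(*(power_set(a) for a in family))  # union of the P(A_i)
rhs = power_set(set().union(*family))               # P of (union of the A_i)
```

Here `lhs <= rhs` holds but the reverse doesn't: $\{1, 3\}$ is a subset of $A_1 \cup A_2$ without being a subset of either $A_i$, which is why the theorem is a subset claim rather than an equality.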

As recommended by Velleman, I'll also try to construct the phrasing of the proof along-side the scratch work as I go. Like so:

Suppose a thing is true that will help to prove the theorem.

[Proof of theorem goes here]

Thus, we've proved the theorem.

Okay, let's start.

The theorem means that if then . So one thing implies the other. We can then class as a "given", and aim to prove as a "goal". So blocking out our answer:

Suppose that .

[Proof of goes here]

Thus, .

So let's start analysing , remembering not to go too far with the logical notation. With baby steps, the definition of a union over a family of sets (here, the outer-most part of the logic) is:

Then, going one step further, using the definition of a power set:

Now we could go further at this point, applying the definition of a subset, but I'll stop the logical deconstruction here. In this instance, I've found that if I keep going so the entire lot is broken down into logical notation, it somehow ends up getting a bit more confusing than it needs to be.

With this as our given, I notice the existential quantifier. Here, I can use "existential instantiation" to plug any value I want into and then assume that what follows is true. So at this point the new "given" is simply:

Nice and simple.

Let's update the outline of our proper proof answer:

Suppose that .

Let be such a value that .

[Proof of goes here]

Thus, .

So let's now move on to our "goal" that we have to prove: .

Again, starting from the outside, going in,

by the definition of a power set becomes:

I can't do a lot with this on its own so I'll keep going with the logical deconstruction. By the definition of a subset, this becomes:

So now, using "universal instantiation" I can say that here, is arbitrary (for the sake of argument, it really can be anything), and that leaves us with an updated "givens" list of:

and

and a new "goal" of

Hey, but wait a sec... Look at our "givens"! If is in ... and is a subset of , then must be in ! And that's our goal!

So update our proof:

Suppose that .

Let be such a value that .

Let be an arbitrary element of .

[Proof of goes here]

Therefore . As is arbitrary, we can conclude that .

Thus, .

So let's wrap this up.

Theorem:
Suppose $\{A_i\}_{i \in I}$ is an indexed family of sets. Prove that $\bigcup_{i \in I} \mathcal{P}(A_i) \subseteq \mathcal{P}\big(\bigcup_{i \in I} A_i\big)$.

Proof:
Suppose that $x \in \bigcup_{i \in I} \mathcal{P}(A_i)$. Let $i$ be such a value that $x \in \mathcal{P}(A_i)$, and let $a$ be an arbitrary element of $x$. But if $a \in x$ and $x \subseteq A_i$, then $a \in A_i$. Therefore $a \in \bigcup_{i \in I} A_i$. As $a$ is arbitrary, we can conclude that $x \subseteq \bigcup_{i \in I} A_i$, i.e. $x \in \mathcal{P}\big(\bigcup_{i \in I} A_i\big)$. Thus, we conclude that $\bigcup_{i \in I} \mathcal{P}(A_i) \subseteq \mathcal{P}\big(\bigcup_{i \in I} A_i\big)$.

Overall the task has involved unravelling the symbols into logic, making sure they flow together, and then wrapping them back up again.

See what I mean by convoluted? All that work for that one short answer. I must admit, I still don't know if my reasoning is 100% correct with this. Despite some parts of this seeming simple, this really is the very limit of what I'm capable of understanding at the moment. I picked this example to write up, as so far I've found it to be one of the most complicated.

The next section of the book seems to marry this quantifier work with a previous section about conjunctions and biconditionals, which I found quite enjoyable at the time. Then towards the end of the chapter, Velleman seems to sneak in some further proof examples using epsilon and delta. I imagine this is a sneaky and clever way to get the reader comfortable with further analysis study...

Alas, my study of Velleman's book will have to stop here. I understand a lot more than I did, though not everything there is to know. I feel it may be enough to give me a slightly smoother ride through my next module, which was the whole point of me picking this book up. It's been so good, I hope I have a chance to return to it. I feel later chapters would put me in an even better position for further proof work!

For now... the countdown begins for the release of my next module's materials!