Your intuition is exactly right - elementary equivalence (satisfying the same first-order sentences) is not the same as isomorphism. Moreover, you're right that the place to look is cardinality differences, since these provide the simplest evidence of non-isomorphism (although you don't go quite as far as you could - why not just look at the empty language $\{\}$? Structures for the empty language are just pure sets, and so are completely determined up to isomorphism by their cardinality.)
Now, how to prove that?
Below I'll give four approaches which use "big theorems." Without more details I don't quite know why your induction argument breaks down - it really shouldn't - but if you say more about it I'll try to help out there too.
First, we can just use a counting argument. Note that for any language $\Sigma$ there are proper-class-many nonisomorphic $\Sigma$-structures: for each cardinal $\kappa$ there are $\Sigma$-structures of cardinality $\kappa$. On the other hand, since our language must always be a set, there are only set-many complete $\Sigma$-theories. By the class-vs.-set version of the pigeonhole principle, some nonisomorphic structures must have the same complete theory.
Maybe more concretely: there are only countably many sentences in the empty language $\{\}$, hence at most continuum-many complete $\{\}$-theories, but there are more-than-continuum-many non-isomorphic $\{\}$-structures (that is, pairwise non-equinumerous sets).
Here, the big theorem we're using is Cantor's theorem - more specifically, the corollary that there is no largest set (together with the fact that any set can be turned into a structure for any language, by interpreting the symbols in a silly way, but that's not really interesting).
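In fact, for the empty language we can do even better than the crude count: the complete theories can be classified outright. A sketch (standard, but worth spelling out):

```latex
% "There are at least n distinct elements" is expressible in the empty language:
\[
  \varphi_n \;:=\; \exists x_1 \cdots \exists x_n
    \bigwedge_{1 \le i < j \le n} x_i \neq x_j .
\]
% The complete theories of pure sets are then exactly:
%   T_n       (axiomatized by \varphi_n \wedge \neg\varphi_{n+1}):
%             "there are exactly n elements," for each n;
%   T_\infty  (axiomatized by \{\varphi_n : n \in \mathbb{N}\}):
%             the theory of an infinite set.
% That is only countably many complete theories, versus proper-class-many
% isomorphism types of pure sets (one per cardinal).
```

(That $T_\infty$ really is complete - i.e., that any two infinite pure sets are elementarily equivalent - is exactly what the game argument at the end of this answer establishes.)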
Now that's a bit weird. Can we do something more natural?
The simplest reasonable proof is to use a theorem you may already know, namely the compactness theorem. For example, let $M$ be an infinite structure, and consider the following theory $T$:
The language of $T$ is the language of $M$, together with new constant symbols $d_i$ for $i\in I$ where $I$ is an index set with $\vert I\vert>\vert M\vert$.
$T$ itself consists of $(i)$ the first-order theory of $M$ and $(ii)$ the sentences $d_i\not=d_j$ for each pair $i\not=j$.
Since $M$ is infinite, $T$ is finitely satisfiable (why?) and hence has a model by compactness; this model (or rather its reduct to the original language of $M$) is elementarily equivalent to $M$ (by $(i)$) but not isomorphic to $M$ since it's too large (by $(ii)$).
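In case the parenthetical "why?" is the sticking point, here is a sketch of the finite satisfiability step:

```latex
% Let T_0 \subseteq T be finite. Then T_0 mentions only finitely many of the
% new constants, say d_{i_1}, \dots, d_{i_k}. Expand M by choosing k distinct
% elements a_1, \dots, a_k \in M (possible since M is infinite) and
% interpreting d_{i_j} as a_j. The expanded structure satisfies:
%   (i)  every sentence of Th(M) in T_0, since its reduct is M itself;
%   (ii) every sentence d_{i_j} \neq d_{i_{j'}} in T_0, since the a_j
%        are pairwise distinct.
% So every finite subset of T has a model, and compactness gives a model
% of all of T.
```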
Another powerful theorem you may already know is the downward Löwenheim-Skolem theorem. This also gives counterexamples: let $A$ be any uncountable structure (in a countable language) and apply downward Löwenheim-Skolem to argue that there is a countable (hence $\not\cong A$) structure $B$ which is elementarily equivalent to $A$.
Yet another useful theorem in this context is the game characterization of elementary equivalence due to Ehrenfeucht and Fraïssé (if memory serves). It's relatively easy to cook up a winning strategy in the relevant games for, say, two pure sets of different infinite cardinalities. I also like these arguments, since they give us a bit more insight into what makes the structures tick.
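For concreteness, here is a sketch of Duplicator's strategy for two infinite pure sets, under the usual convention that winning every $n$-round game $\mathrm{EF}_n(A,B)$ characterizes agreement on sentences of quantifier rank $\le n$:

```latex
% Claim: if A and B are infinite pure sets, Duplicator wins EF_n(A,B) for
% every n.
% Strategy: maintain the invariant that the pairs played so far,
% (a_1, b_1), \dots, (a_m, b_m), form a partial bijection, i.e.
%   a_i = a_j  iff  b_i = b_j.
% When Spoiler plays a \in A: if a = a_i for some earlier i, reply b_i;
% otherwise reply any b \in B \setminus \{b_1, \dots, b_m\}, which exists
% since B is infinite. Moves in B are handled symmetrically.
% In the empty language a partial bijection is automatically a partial
% isomorphism, so Duplicator wins every round - hence A \equiv B even
% when |A| \neq |B|.
```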
Best Answer
For your first question, there is a lot to say (the issue is related to intensional vs. extensional equality), but one important difference is simply that logical equivalence and equality apply to different types of things: the former applies to logical formulas, the latter to terms. If we were to use the symbol "$=$" for both, then the rules of any formal system for manipulating expressions with "$=$" would be more complicated, because they would have to split into cases depending on which type of thing the "$=$" applied to.
Another issue with your proposed expression $(2x=3)=(x=3/2)$ is the use of parentheses to serve as quotation marks for logical formulas, which could invite confusion with the more usual use of parentheses.
For the second question, I think the reason is that $\Leftrightarrow$ (`\Leftrightarrow`) is a generic symbol that can be used for lots of different things, whereas $\iff$ (`\iff`) is for logical equivalence, which is a binary operator with low precedence in the order of operations, so it is elongated in order to provide a large visual separation between the things on either side.
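You can see the difference directly in standard LaTeX, where `\iff` is defined essentially as `\Longleftrightarrow` padded with thick math spaces (`\;`) on both sides:

```latex
$P \Leftrightarrow Q$   % short arrow, no extra spacing around it
$P \iff Q$              % long arrow with \; padding on each side
```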